If only it were dead dead. Not kind of dead, but actually dead. Instead, it's the walking dead. There is so much content recorded in NTSC that still needs to be dealt with for the non-NTSC formats of today and tomorrow. So many people have done so many bad things to NTSC video, for lack of understanding its peculiarities, that it still causes us so much pain today.
NTSC was such a total kick-the-can-down-the-road solution. The number of B&W sets already sold, which forced the decision to keep the signal compatible with them, is so small that we probably see that many TV sales in a week now. Hindsight and all that, but the repercussions of that decision have cost the industry so much. Then again, niche jobs exist because of it, and I've pretty much made a living from it.
Honest-to-goodness engineers required to operate equipment, interlacing, 3:2 pulldown, broken cadence, color burst, 1 volt peak-to-peak, IRE, waveforms/vectorscopes, pedestal, blacker-than-black, TBCs and video/black/chroma/hue, whiter-than-white, broadcast safe, chroma bleed, audio anomalies because the video signal was too strong, pincushioning on TV monitors because the video signal was too strong, and sooo many more issues that I'm so happy to see go the way of the dodo.
From what I could tell, my TV was taking a perfect 1920x1080 signal, and instead of perfectly stretching it to 4K (4 pixels on the screen for every pixel in the input), it was first cropping the edges, then stretching.
This results in clean lines in the input becoming more "fuzzy" on the screen, along with a very slight zoom effect.
It's maddening and I can't understand why it would be the default. Across my HDMI sources, TV apps, and live TV, none of these signals need any cropping. They fit perfectly with my settings.
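For the curious, here's a rough sketch of the two scaling paths in numpy. This is illustration, not what any particular TV actually runs, and the ~3% overscan crop is an assumed figure:

```python
import numpy as np

def integer_scale_2x(frame: np.ndarray) -> np.ndarray:
    """Clean 1080p -> 4K: every source pixel becomes a 2x2 block on screen."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

def overscan_then_scale(frame: np.ndarray, crop_pct: float = 0.03) -> np.ndarray:
    """What the TV seems to do: crop ~3% off each edge, then resample the
    remainder up to 3840x2160. Pixels no longer map 1:2, so edges go fuzzy."""
    h, w = frame.shape[:2]
    dy, dx = int(h * crop_pct), int(w * crop_pct)
    cropped = frame[dy:h - dy, dx:w - dx]
    # Nearest-neighbour resample for brevity; a real TV uses a softer filter,
    # which is exactly where the fuzziness comes from.
    ys = np.arange(2160) * cropped.shape[0] // 2160
    xs = np.arange(3840) * cropped.shape[1] // 3840
    return cropped[np.ix_(ys, xs)]
```

With integer scaling, a 1-pixel-wide line stays a crisp 2-pixel block; after crop-and-stretch it lands between pixels and gets smeared.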
And then we had the infamous blue-only mode to help set the colors on the monitor. 100% bars, 75% bars. Full field vs. SMPTE. PLUGE. Ramps. Audio sweeps.
Good grief, it's starting to feel like an actual funeral, with people trading stories of memories good or bad. Rot in hell, NTSC!!
Looked at another way, the NTSC color television standard was sort of a genius hack. I don't think its creators could've imagined it would live so long.
I don't watch TV, so I don't know the overall situation in my country, but when I do see broadcasts on public TVs (for example, in bars and restaurants showing sports), I've noticed that compression issues happen, decoding issues happen, and the whole thing is laggy. It's most obvious with sports: in analog TV times, I would hear the cheers and fireworks after goals, victories and so on all at the same time. Now I hear them staggered, with seemingly poorer people cheering first, I assume because they have analog TVs and radios.
Then, we got analog cable. When you changed channels, the change was still instant, and you used the same remote as for OTA TV (if you were rich enough for a remote) because the cable just plugged straight into the back of the TV. We got a bunch of channels, including the local ones from before, a handful of new channels worth watching, and a ton of others that were absolutely worthless. This was expensive, but at least cable channels were ad-free... until they were not.
Next we got digital cable. This marked a real downturn in user experience. The signal had to be processed by a box that we had to rent instead of just plugging the cable into the TV. We needed a new box and new remote for every TV. Theoretically the new remotes could replace the old ones, but that never worked out in practice. You always needed the old remote for some function not available or not programmed properly on the new one. The boxes also took forever to switch between channels. Channels were no longer straightforward one-channel-to-one-station mappings; now you had subchannels and HD/SD versions of the same channel. My parents, while they were alive, never understood that, and usually ended up watching the SD channel instead of the HD one because it was all they could find. In fact, for them, there was usually no increase in picture quality, just in price and inconvenience.
Finally I inherited the TVs, but I never watch them, and I cancelled the cable subscription. I got a digital TV antenna, and it sometimes picks up most of the local stations, which were all that my parents ever watched, but usually it glitches or fails. Changing channels still takes forever, but at least the antenna just plugs into the back of the TV without a rented box. Of what is available on OTA television, as with what was available on cable, there isn't enough to keep my attention. Most of it is vulgar trash riddled with advertisements.
I've gone back to reading books for entertainment. They don't require any cable or antenna, no rented box, no remote control. I can get whatever I want from the library (for free!) or Amazon, and I can change books without lag. The signal never drops or glitches. I can take my books with me anywhere, including outside. I'm happier now.
Digital cable (QAM or ClearQAM) came across the same wire as the rest of the stuff...cable internet, encrypted digital cable, etc., and you could tune it with any compatible QAM tuner, like the combined ATSC/QAM tuners built into most TVs and loads of cheap capture cards.
I always hated the "digital cable boxes" required for encrypted content, but for a while, an old PC with a capture card and Windows Media Center let me have a much nicer interface and DVR without the extra rental costs. In fact, I once lived in a building that was split into three apartments and we found that as long as one of us had cable internet, we also had ClearQAM cable TV...
...which I imagine is why they got rid of it. Cable modems need to be provisioned. Cable boxes need to be rented if you want to decrypt the channels you'd paid to authorize decryption of. But QAM was just on the wire. So easy to use and so easy to mooch, so the cable company used the confusion around the OTA analog-to-digital switch to also remove QAM and move everyone over to those rented decoder boxes.
Incidentally, it's also when I stopped watching cable TV. The boxes were an additional cost for a worse experience than I had with the built-in tuner or my HTPC. I thought about getting a new capture card with CableCard support, but that standard was a huge pain in the ass and never really took off outside of Tivo and similar turnkey devices. Thankfully, it's also when streaming and other OTT services started to take off.
Now I just stream it. At least I don't have to think about it, though I don't "have" copies of it in the same way. But the more decades go by, the less likely it is I'd ever watch that old stuff I recorded in 2010 from those old hard drives anyway.
Think of streaming or video-on-demand services, where it's now up to five seconds to change channels.
It's certainly possible to build non-causal signal processors for broadcast TV by using multiple tuners that buffer channel N+1 while you're watching channel N. You could do it for streaming services too by using statistics (or--dare I say it--AI) to predict what movie the user is most likely to sample next.
It might make the decoder cost an extra $2, and that's probably why nobody does it.
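As a sketch of what that could look like (the Tuner interface here is hypothetical; real tuner hardware APIs differ):

```python
from collections import deque

class Tuner:
    """Stand-in for real tuner hardware; this interface is made up."""
    def tune(self, channel: int) -> None: ...
    def read_packets(self) -> bytes: return b""

class PredictiveReceiver:
    """Watch channel N on one tuner while a spare tuner keeps a short
    rolling buffer of channel N+1, so 'channel up' can start instantly."""

    def __init__(self, main: Tuner, spare: Tuner, channel: int):
        self.main, self.spare, self.channel = main, spare, channel
        self.prebuffer: deque[bytes] = deque(maxlen=512)  # roughly a GOP or two
        self.main.tune(channel)
        self.spare.tune(channel + 1)

    def pump(self) -> bytes:
        """Per tick: return live packets, stash the next channel's packets."""
        self.prebuffer.append(self.spare.read_packets())
        return self.main.read_packets()

    def channel_up(self) -> bytes:
        """Swap tuners and replay the buffer instead of waiting for an I-frame."""
        self.main, self.spare = self.spare, self.main
        self.channel += 1
        buffered = b"".join(self.prebuffer)
        self.prebuffer.clear()
        self.spare.tune(self.channel + 1)  # start prebuffering the new N+1
        return buffered
```

The same trick generalizes to streaming: prefetch the first GOP of whatever title the recommender thinks you'll click next.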
Ultimately the problem is that with old analog signals you just needed to wait for the next V-blank (4 hundredths of a second at most) and you were good to go. With digital signals you need to wait for the next I-frame, which can take dozens or hundreds of frames. And this assumes you have a good solid signal and can decode everything perfectly; if the set is trying to piece together a fragmentary signal, it's even harder.
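Back-of-the-envelope numbers (the GOP lengths below are illustrative assumptions; broadcasters pick their own):

```python
# Worst-case wait before the first complete picture, analog vs digital.
analog_worst_s = 1 / 25   # next full frame at 25 fps: 0.04 s

fps = 30.0
for gop_frames in (15, 60, 300):          # short, typical, long GOP (assumed)
    digital_worst_s = gop_frames / fps    # must wait for the next I-frame
    print(f"GOP {gop_frames:3d} frames: up to {digital_worst_s:4.1f} s "
          f"(vs {analog_worst_s * 1000:.0f} ms analog)")
```

That's a factor of 10 to 250 before you even account for demuxing, decoding, and resyncing on a marginal signal.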
Cable systems in the USA took advantage of more frequencies than were available over terrestrial television to get more channels. This is because the cable signal was still RF, but not subject to radio interference in any significant way because it came through a cable. The upshot of this was that unless your TV was "cable-ready", i.e., capable of tuning the additional frequencies, in order to use even analog cable you needed a set-top box. It took a few years for cable-ready TVs to hit the US market -- but by the 90s they were ubiquitous.
Maybe television stations did not care enough about broadcast anymore to invest in digital broadcasting capability that matched the power of their old analog broadcasts.
This may have coincided with lower investment in broadcast anyway. You're replacing the transmitting equipment, and the per-viewer value has dropped enough that reaching rural areas isn't worth it, so you don't care about providing as much range as the much older analog equipment did, and you cheap out on both equipment and broadcast strength.
Another is simply self-selection bias. When a rapid technical transition takes place, the average reception might improve significantly, but you can't guarantee it for everyone. A slight frequency change or a different reflection pattern can drastically alter the signal intensity in a given spot. But you will only hear from people who happened to have good analog reception and lost it; those with bad analog had already moved to cable and are unaware that good terrestrial digital has become available.
My experience with Freeview in the greater London area was very positive, with a very simple antenna I was able to get around 100 channels, the vast majority in perfect quality.
The newest ATSC3.0 standard has a somewhat richer toolbox allowing the optional use of 1024 or 4096QAM vs 256QAM for DVB, more advanced LDPC error correcting codes, LDM multiplexing and MIMO. The industry seems to believe the added complexity is not worth the marginal benefits.
The fundamental difference between Europe (or at least the UK) and the US seems to be the concern regulators had for improving over-the-air TV. While in the UK the move to digital was used to broadly improve the offering and the reception quality, in the US there was less concern, and the stations found the death of analog a good opportunity to close down transmitters, recover spectrum and remove free programming.
The Shannon limit for a 5 MHz channel with 10 dB SNR is about 17 Mbit/s, and the latest DTV standards approach this: easily 10, maybe 15 Mbit/s. That's enough for one HD channel in perfect quality, or many more in SD.
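The arithmetic, for anyone who wants to check it (this is just the Shannon-Hartley formula; the two SNR points are examples):

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), S/N as a power ratio."""
    return bandwidth_hz * log2(1 + 10 ** (snr_db / 10))

print(shannon_capacity_bps(5e6, 10) / 1e6)  # ~17.3 Mbit/s at 10 dB
print(shannon_capacity_bps(5e6, 20) / 1e6)  # ~33.3 Mbit/s at 20 dB
```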
So the problem is not with analog vs digital, that debate is long settled. It's the specific tradeoffs made when digital TV was deployed.
ATSC 3.0 is supposed to fix this, but brings with it its own feature bloat.
In reality, the switch from analog to digital means that while people in urban areas have more, clearer channels, people in rural areas (who could at least watch snow-covered analog TV) now have fewer channels, if any.
Even if a mathematical calculation says that my TV should work great, I still have nothing if I can't pick up any channels.
I don't think asking people to run a proper antenna is asking too much, especially for all the benefits digital brings in.
With digital, if the stream desyncs, it's pretty much out for 30-40 seconds, at least for me, and then it repeats every few minutes. With analog, while the signal was bad, you could in a pinch still watch it. I knew people who watched 2nd/3rd-generation VHS copies of movies back then. They were not that picky: the color was gone and the sound warbled, but they were just happy to be able to watch the movie. If it had just glitched out and died, they would not have bothered. It was also that they just did not know better was out there. I would show people a DVD and a VHS back to back on an SD TV and they suddenly "got it": they had been watching snow. We look back on it now and see rubbish, but at the time they were just happy to get anything.
With analog broadcast, if you were a bit too far away you would still get a fuzzy image and sounds that had some interference with it. Now if you're a bit too far away the picture breaks up and the audio sounds like auto-tune. It doesn't take much of that to make it unwatchable.
On the other hand, when you get a solid signal the picture is amazing versus the standard-def of analog broadcasts. Add to this the benefit of multiple sub-channels on each broadcast channel and you get a lot more watching options.
I've installed an attic antenna and a pre-amplifier that supplies a great signal to all the coax (CATV) outlets in the house. That hardware cost about what 1-2 months of cable TV subscription would have, and that was years and years ago.
On balance the digital changeover has been much better in my experience.
Subchannels are kind of nice, but junky content with junky compression is worse than a more limited set of snowy channels. When they do broadcast stuff with enough bandwidth, and you stay above the cliff the whole time, it's nice.
There's something very interesting about this concept of "cyber-sync" - when is now? Is your now different from mine?
When digital signals "fail" (usually just minor interference or some power fluctuation), you usually just get a black screen or a picture that cuts in and out. When analog signals "failed" (in the same interference or fluctuation) then you likely got a distorted image/sound/colour for a few seconds but it didn't really interrupt your viewing experience.
It's an interesting observation that in areas where analog and digital were still transmitted in parallel, the "live" analog signal was "faster".
Also, ISDB has the idea of a 'mobile' channel, which can be useful in low-SNR conditions (it is worse for pretty much everything else, though).
The ISDB-T modulation scheme is very interesting. It comprises 13 sub-carriers ("segments"), and by spec one of those can be the "mobile" channel (called 1seg, as it occupies one of the 13 segments and leaves 12 for the main mux). It can have different modulation parameters from the 12 main sub-carriers, so you can make it more robust with less available bandwidth, and make the main mux less robust but with more bandwidth.
IIRC there are other ways to distribute the ISDB-T segments, but I don't remember how far out of spec that would be. Most 13-segment receivers (i.e., TVs, set-top boxes, some USB adapters) wouldn't care (at least 10 years ago they didn't) and it would still work, though.
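A rough sketch of how the split shakes out; the per-segment rates below are ballpark figures for one assumed mode/guard-interval choice, not spec-exact values:

```python
# ISDB-T: 13 OFDM segments; segment modulation can differ, so the "1seg"
# mobile service runs robust-but-slow while the main mux runs fast-but-fragile.
SEG_RATE_MBPS = {        # approximate Mbit/s per segment (assumed parameters)
    "QPSK_2/3":  0.42,   # robust: what a 1seg-style mobile service might use
    "64QAM_3/4": 1.26,   # fragile but fast: typical fixed-reception main mux
}

oneseg   = 1 * SEG_RATE_MBPS["QPSK_2/3"]
main_mux = 12 * SEG_RATE_MBPS["64QAM_3/4"]
print(f"1seg mobile service: ~{oneseg:.2f} Mbit/s")
print(f"12-segment main mux: ~{main_mux:.1f} Mbit/s")
```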
Importing a JPN/US console was not only to get access to region-exclusive titles!
Good thing that's over.
That said, for broadcast TV, PAL looked far nicer - NTSC was named 'Never The Same Colour' for a reason.
I feel old.
(SCART is another one of these, btw: it stands for Syndicat des Constructeurs d'Appareils Radiorécepteurs et Téléviseurs (Radio and Television Receiver Manufacturers' Association), and the connector was supposed to be called péritel (insert IBA ENG video discussing SCART), but the SCART name stuck.)
* And Canada and...
In retrocomputing terms, NTSC is something I don't mind having around and am able to deal with.
In watching TV / video terms? Yeah, NTSC is craaaaaappy! I won't miss it either, though I do enjoy the look on some programs.
DTV is pretty great. I live in a region where signals are reasonable, and it's just better overall. For the few hours I watch TV, DTV gets it done.
On the analog vs digital front, for things like embedded projects, I really dislike digital video. It's resource intensive. Analog RGB or Component is EASY, lean, and clean.
Right now, the golden time is ending. If you want analog video gear, good CRTs, that sort of thing, it's still possible, but it's starting to get expensive. I nabbed a nice multi-format PVM for my retro gear / computing experiences, a few other displays, and a couple of TVs to modify into an arcade cab with a real, fast, gorgeous CRT.
Now is the time! Won't last long. If you want analog, get it. Soon, it will be a curio, rare, expensive.
However, 1080 is a bigger number than 720, so most sources are interlaced 1080i rather than progressive 720p.
4K at least got rid of interlacing as an option, so 2160 is always progressive.
There's a substantial difference in bitrate between OTA broadcast (for the main channel) and cable. Typical cable will stat-mux several channels down to well below 10 Mbps per channel. The main OTA channel tends to be around 15 Mbps (for a different encoding) but not stat-muxed. So to some extent, the quality of the channel you're watching on cable/fiber depends on what is going on in the other channels of the same mux.
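A toy illustration of the stat-mux idea: a fixed pipe (here, the roughly 38.8 Mbit/s payload of a 256-QAM cable slot) split across channels in proportion to momentary scene complexity. The complexity numbers are made up:

```python
# Toy statistical multiplexer: a fixed pipe shared across channels in
# proportion to each channel's momentary scene complexity.
PIPE_MBPS = 38.8  # approximate 256-QAM cable channel payload

def stat_mux(complexity: dict[str, float],
             pipe: float = PIPE_MBPS) -> dict[str, float]:
    total = sum(complexity.values())
    return {ch: pipe * c / total for ch, c in complexity.items()}

# Fast motion on the sports channel steals bits from the talking heads:
for ch, mbps in stat_mux({"sports": 9.0, "news": 2.0,
                          "shopping": 1.0, "reruns": 2.0}).items():
    print(f"{ch:9s} {mbps:5.2f} Mbit/s")
```

So a quiet channel's surplus becomes another channel's headroom, and your picture quality rides on your neighbors' content.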
AFAIK, pretty much all cable transmissions are MPEG-2 (and were analog for a long time before that). It's a little nuts considering all the benefits and readily available hardware decoders for newer standards. An H.264 stream would look WAY better with far fewer bits.
For most programs, most of the time, an audio loss or glitch is more damaging than a video loss or glitch. If they had given more bandwidth to audio so they could use more robust ECC, ATSC would have been a lot more watchable for those of us who don't have strong signals.
The issue we have is that ATSC 1.0 has only just now become the standard everywhere, yet it dates from 1995. As you can imagine, we've come a LONG way in the last 26 years in terms of codec capabilities.
If we could just get people to upgrade equipment... ATSC allows for much more robust error handling but it's not used because few devices support it.
Right now it seems like the only people deploying it are paid services (which, IMO, shouldn't be legal. We shouldn't have to share public broadcast frequencies with private corps).
The issue is clearly that the digital signals are being compressed to within an inch of their lives, since the poor quality is due to obvious compression artifacts.
So, I don't think it's a technology problem, I think it's a greed problem: the TV stations are trying to subdivide their frequency allotment so they can fit in as many different video streams as possible. Either way, it's still a problem: OTA broadcasts in my area border on unwatchable.
Fortunately, that was also around the time that I stopped watching TV, so it wasn't a real problem for me (which is good, because the new digital signals literally did not come to my neighborhood -- we went from analog TV to no OTA TV at all). But it was impossible to not notice that broadcast TV quality became orders of magnitude worse, pretty much overnight.
To those living in US, is ATSC 3.0 a thing yet?
It's funny it mentions DVB-T2 as the counterpart in other parts of the world, when in many parts of Europe DVB-T2 has now been in use for over a decade. The earlier DVB-T2 deployments are stuck using AVC, though, while Germany and France have newer DVB-T2 variants with HEVC.
Or do Americans generally not care about free-to-view TV, and only use cable or other pay-TV networks?
But I'm thinking now... I could at least build something from the ground up. I don't know how much machinery I would have to build to talk to an HDTV connector - maybe I would even have to apply for a key.