End of an Era: NTSC Finally Goes Dark in America (hackaday.com)
129 points by woldemariam 17 days ago | 94 comments



And the angels sang!

If only it were dead dead. Not kind of dead, but actually dead. Instead, it's the walking dead. There is so much content recorded in NTSC that still needs to be dealt with for the non-NTSC formats of today and tomorrow. And so many people have done so many bad things to NTSC video, for lack of understanding its peculiarities, that it still causes us a great deal of pain today.

NTSC was such a total kick-the-can-down-the-road solution. The number of B&W sets that had been sold, which forced the decision to keep the signal compatible with them, is so small that we probably see that many sales in a week now. Hindsight and all that, but the repercussions of that decision have cost the industry so much. Then again, niche jobs exist because of it, and I've pretty much made a living from it.

Honest-to-goodness engineers required to operate equipment, interlacing, 3:2 pulldown, broken cadence, color burst, 1 volt peak-to-peak, IRE, waveforms/vectorscopes, pedestal, blacker-than-black, TBCs and video/black/chroma/hue, whiter-than-white, broadcast safe, chroma bleed, audio anomalies because the video signal was too strong, pincushioning on TV monitors because the video signal was too strong, and sooo many more issues that I'm so happy to see go the way of the dodo.
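For anyone who never had to line up a waveform monitor, here is a rough cheat sheet of the NTSC composite levels being mourned above. A minimal sketch in Python; the figures are the commonly quoted ones and should be treated as approximate.

    # Approximate NTSC (System M) composite signal levels, expressed in IRE units.
    # 140 IRE spans the full 1 V peak-to-peak signal, so 1 IRE is about 7.14 mV.
    NTSC_LEVELS_IRE = {
        "sync_tip": -40,        # bottom of the horizontal/vertical sync pulses
        "blanking": 0,          # reference level during blanking
        "black_setup": 7.5,     # "pedestal": black sits above blanking in US NTSC
        "reference_white": 100, # legal peak luminance ("whiter than white" goes above this)
    }

    IRE_TO_VOLTS = 1.0 / 140  # ~7.14 mV per IRE for a 1 V p-p signal

    def ire_to_volts(ire: float) -> float:
        """Convert an IRE level to volts relative to blanking."""
        return ire * IRE_TO_VOLTS

    for name, ire in NTSC_LEVELS_IRE.items():
        print(f"{name:>16}: {ire:+6.1f} IRE = {ire_to_volts(ire)*1000:+7.1f} mV")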


There is one unfortunate artifact from NTSC that is still with us: overscan. For some reason TVs will take a digital signal from HDMI, crop off the edges, and stretch the image. So some video game consoles and streaming devices allow you to shrink their output to compensate. It’s madness.

https://en.m.wikipedia.org/wiki/Overscan


I turned this off on my TV and was just like...wow! So much better!

From what I could tell, my TV was taking a perfect 1920x1080 signal, and instead of perfectly stretching it to 4K (4 pixels on the screen for every pixel in the input), it was first cropping the edges, then stretching.

This results in clean lines in the input becoming more "fuzzy" on the screen, with a very slight zoom effect.

It's maddening and I can't understand why it would be the default. Across my HDMI sources, TV apps, and live TV, none of these signals need any cropping. They fit perfectly with my settings.
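To put numbers on it, here is a back-of-the-envelope sketch of what a typical overscan crop does to a 1080p signal on a 4K panel. The 2.5% figure is an assumed, common default, not something taken from the comment above.

    # Rough illustration of why overscan hurts on a fixed-pixel display.
    # Assumes a hypothetical 2.5% crop per edge, a common overscan default.
    SOURCE_W, SOURCE_H = 1920, 1080
    PANEL_W, PANEL_H = 3840, 2160
    CROP_FRACTION = 0.025  # 2.5% shaved off each edge

    cropped_w = round(SOURCE_W * (1 - 2 * CROP_FRACTION))  # 1824
    cropped_h = round(SOURCE_H * (1 - 2 * CROP_FRACTION))  # 1026

    # Without overscan: exact 2x integer scaling, every source pixel maps to a 2x2 block.
    # With overscan: a non-integer scale factor, so every line gets resampled (fuzzy).
    scale_x = PANEL_W / cropped_w
    scale_y = PANEL_H / cropped_h

    print(f"cropped input: {cropped_w}x{cropped_h}")
    print(f"scale factor:  {scale_x:.3f} x {scale_y:.3f} (vs a clean 2.000 with overscan off)")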


legacy is difficult to overcome. let's call it the granddaddy of all tech debt


Good ol' overscan helped "hide" the horizontal and vertical blanking: literally turning the scanning electron beam off and back on while it retraced from right to left and from bottom to top, for the brief interval each of those on/off blinks required. Lots of monitors had an underscan mode, and professional monitors even had H/V delay; with all of them enabled you'd get the cross-pulse display that made it much easier to see the interesting info that lived in the overscan portions, things like VITC and the CC "marching ants" in the scan lines outside of active picture. There were even ID tags that could be stored in those lines.
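For reference, this is roughly where that "interesting info" lived in each 262.5-line NTSC field; the line assignments below are typical rather than exhaustive, and quoted from memory.

    # Typical uses of the NTSC vertical blanking interval (per field, lines 1-21).
    # Line numbers are the commonly cited ones; actual usage varied by facility.
    VBI_USAGE = {
        "1-9":   "vertical sync and equalizing pulses",
        "10-20": "VITC timecode, test/ident signals, teletext-style data",
        "21":    "EIA-608 closed captions (the 'marching ants')",
        "22+":   "start of active picture",
    }

    for lines, use in VBI_USAGE.items():
        print(f"lines {lines:>5}: {use}")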

And then we had the infamous blue-only mode to help set the colors on the monitor. 100% bars, 75% bars, full field vs. SMPTE, PLUGE, ramps, audio sweeps.

Good grief, it's actually starting to feel like an actual funeral, with people trading stories and memories good or bad. Rot in hell, NTSC!!


It probably wasn't how many B&W sets were sold, but who owned them, that drove the compatibility. Sets being expensive luxury items meant you didn't want to upset the existing userbase.


I wasn't saying that the number alone set the direction. Yes, they did not want to deal with the bad PR of telling everyone that the very expensive device they'd owned for no more than two years was now antiquated and required the purchase of a new one. Could you possibly imagine the mass public spending $1000+ each year on a brand-new device? The PR hit from the masses would be so extremely negative that no industry would ever do that, right? Some device that is nothing but entertainment? Each year? Replacing a fully functional device annually? No, surely the masses would never accept that.


Except that $1000 in the early 1950s is equivalent to about $10k today. Would you spend that every year? (typed on a 6-year-old iPhone 6S)


> NTSC was such a total kick the can down the road solution.

Looked at another way, the NTSC color television standard was sort of a genius hack. I don't think its creators could've imagined it would live so long.


NTSCobol


Interesting to see the angry comments about how Digital TV fails more than Analog.

I don't watch TV, so I don't know what the situation is in my country, but I have noticed when seeing broadcasts on public TVs (for example in bars and restaurants showing sports) that compression issues happen, decoding issues happen, and the whole thing is laggy. It's most obvious when you're watching sports: during analog TV times I would hear the cheers and fireworks after goals, victories and so on all at the same time; now I hear them staggered, with seemingly poorer people cheering first, I assume because they have analog TVs and radios.


I grew up with analog OTA TV. When you changed channels, the change was instant. We got a handful of local channels.

Then, we got analog cable. When you changed channels, the change was still instant, and you used the same remote as for OTA TV (if you were rich enough for a remote) because the cable just plugged straight into the back of the TV. We got a bunch of channels, including the local ones from before, a handful of new channels worth watching, and a ton of others that were absolutely worthless. This was expensive, but at least cable channels were ad-free... until they were not.

Next we got digital cable. This marked a real downturn in user experience. The signal had to be processed by a box that we had to rent instead of just plugging the cable into the TV. We needed a new box and new remote for every TV. Theoretically the new remotes could replace the old ones, but that never worked out in practice. You always needed the old remote for some function not available or not programmed properly on the new one. The boxes also took forever to switch between channels. Channels now were no longer straightforward one-channel-to-one-station mappings, but you had subchannels and HD/SD versions of the same channel. My parents never understood that while they were alive and usually ended up watching the SD channel instead of the HD because it was all they could find. In fact, for them, there was usually no increase in picture quality, just in price and inconvenience.

Finally I inherited the TVs, but I never watch them, and I cancelled the cable subscription. I got a digital TV antenna, and it sometimes picks up most of the local stations, which were all that my parents ever watched, but usually it glitches or fails. Changing channels still takes forever, but at least the antenna just plugs into the back of the TV without a rented box. Of what is available on OTA television, as with what was available on cable, there isn't enough to keep my attention. Most of it is vulgar trash riddled with advertisements.

I've gone back to reading books for entertainment. They don't require any cable or antenna, no rented box, no remote control. I can get whatever I want from the library (for free!) or Amazon, and I can change books without lag. The signal never drops or glitches. I can take my books with me anywhere, including outside. I'm happier now.


There was a little while where we still had unencrypted digital cable (in the US at least). I'm sure it's still around in some places, but we lost the option years ago where I live.

Digital cable (QAM or ClearQAM) came across the same wire as the rest of the stuff...cable internet, encrypted digital cable, etc. and you could tune it with any compatible ATSC tuner like the one built into most TVs and loads of cheap capture cards.

I always hated the "digital cable boxes" required for encrypted content, but for a while, an old PC with a capture card and Windows Media Center let me have a much nicer interface and DVR without the extra rental costs. In fact, I once lived in a building that was split into three apartments and we found that as long as one of us had cable internet, we also had ClearQAM cable TV...

...which I imagine is why they got rid of it. Cable modems need to be provisioned. Cable boxes need to be rented if you want to decrypt the channels you'd paid to authorize decryption of. But QAM was just on the wire: so easy to use and so easy to mooch. So the cable company used the confusion around the OTA analog-to-digital switch to also remove QAM and move everyone over to those rented decoder boxes.

Incidentally, it's also when I stopped watching cable TV. The boxes were an additional cost for a worse experience than I had with the built-in tuner or my HTPC. I thought about getting a new capture card with CableCard support, but that standard was a huge pain in the ass and never really took off outside of Tivo and similar turnkey devices. Thankfully, it's also when streaming and other OTT services started to take off.


Digital cable boxes with Firewire were a fun-but-super-tedious way for a brief few years to be able to record even digital cable content to a PC.

Now I just stream it. At least I don't have to think about it, but I don't "have" copies of it in the same way. Then again, the more decades go by, the less of that old stuff I recorded in 2010 I'd ever watch again from those old hard drives anyway.


Latency always goes up with technology.

Think of streaming or video-on-demand services, where it's now up to five seconds to change.


> Latency always goes up with technology.

It's certainly possible to build non-causal signal processors for broadcast TV by using multiple tuners that buffer channel N+1 while you're watching channel N. You could do it for streaming services too by using statistics (or--dare I say it--AI) to predict what movie the user is most likely to sample next.

It might make the decoder cost an extra $2, and that's probably why nobody does it.
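A minimal sketch of the kind of speculative buffering being described: keep a second tuner parked on whichever channel the viewer is most likely to zap to next, so the switch can be served from a local buffer. Everything here is hypothetical; the prediction rule is just "next channel up", not anything a real set-top box does.

    from collections import deque

    class SpeculativeTuner:
        """Keep tuner A on the current channel and tuner B pre-buffering a predicted one."""

        def __init__(self, buffer_seconds: float = 2.0):
            self.buffer_seconds = buffer_seconds
            self.current = None
            self.prefetch_channel = None
            self.prefetch_buffer = deque()  # would hold buffered TS packets / decoded frames

        def predict_next(self, channel: int) -> int:
            # Placeholder policy: assume the viewer zaps upward. A real box might use
            # viewing history or (as the comment jokes) some statistical model.
            return channel + 1

        def tune(self, channel: int) -> str:
            if channel == self.prefetch_channel and self.prefetch_buffer:
                # Hit: a GOP is already buffered, so the switch feels instant.
                result = "instant switch from prefetch buffer"
            else:
                # Miss: fall back to the normal wait for the next I-frame.
                result = "cold switch, waiting for next I-frame"
            self.current = channel
            self.prefetch_channel = self.predict_next(channel)
            self.prefetch_buffer = deque(["<buffered GOP>"])  # stand-in for real data
            return result

    tuner = SpeculativeTuner()
    print(tuner.tune(5))   # cold switch
    print(tuner.tune(6))   # predicted correctly -> instant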


Seems like you'd also need a second tuner which would increase the cost a bit more.

Ultimately the problem is that with old analog signals you just needed to wait for the next V-blank (four hundredths of a second at the most) and you were good to go. With digital signals you need to wait for the I-frame, which can take dozens or hundreds of frames. And this assumes you have a good solid signal and can decode everything perfectly; if the set is trying to piece together a fragmentary signal, it's even harder.
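Rough numbers for that comparison, assuming a 30 fps stream with one I-frame every two seconds (a typical broadcast GOP, not a figure measured from any particular station):

    # Worst-case wait before a usable picture appears after a channel change.
    FPS = 30.0
    GOP_FRAMES = 60                  # one I-frame every two seconds is a common broadcast choice

    analog_wait = 1 / 30.0           # at worst, wait for the next full frame (a few hundredths of a second)
    digital_wait = GOP_FRAMES / FPS  # just miss the I-frame and you wait out the whole GOP

    print(f"analog worst case:  {analog_wait*1000:.0f} ms")
    print(f"digital worst case: {digital_wait:.1f} s, plus demux/decode/AV-sync time")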


Possibly there are some tricks applicable to improving channel-switching performance (like including a low-res I-like frame more frequently?), but no one has an incentive to improve zapping performance.


It happens in computing too. Even on very recent machines, with very recent OS kernels, you can notice the layers. Very odd to see.


Because the developers are lazy. Why do we need 20 layers of libraries to draw a rectangle? Give a developer a library and he will use it.


Give the developer no library and they will find a way to slam the bits into VRAM.

That said, the friction for modern devs to go from zero to a cross-platform app is much lower with Electron than with any other framework. So learn to love Electron, because it is here to stay.


Ah yes, I've gone without watching TV for a good 25 years now. I got Netflix on and off a few times, but most of the shows are such a brainwash that it's not worth it. Books and people, that's it for me now.


Well, ackshually...

Cable systems in the USA took advantage of more frequencies than were available over terrestrial television to get more channels. This is because the cable signal was still RF, but not subject to radio interference in any significant way because it came through a cable. The upshot of this was that unless your TV was "cable-ready", i.e., capable of tuning the additional frequencies, in order to use even analog cable you needed a set-top box. It took a few years for cable-ready TVs to hit the US market -- but by the 90s they were ubiquitous.


It is worse if you have a poor signal. Digital either works and looks great, or it's not watchable. An analog station might look kinda snowy but be completely watchable. I used to be able to get a few channels before the switchover on my bedroom TV with rabbit ears; after the switchover I got one or two on a good day, because the signal strength just wasn't strong enough.


Snowy analog with intelligible sound translates to about a 40 dB signal-to-noise ratio. That's an SNR where digital will deliver a crystal-clear image. When digital becomes glitchy, you will absolutely be unable to watch analogue: you'll have trouble maintaining vertical and horizontal sync, and the sound will be drowned in noise.


There must be some other factor at work, then, because I had the same experience when I didn't have cable around 2010. For a few big sports events I tried to get broadcasts, and analog was watchable (by analog standards) when digital was unwatchable because of frequent glitching. Buying a fancy antenna did not help.

Maybe television stations did not care enough about broadcast anymore to invest in digital broadcasting capability that matched the power of their old analog broadcasts.


I bet a bunch of them did some naïve calculation that said "because digital performs better, we can reduce broadcast power by X, for a savings of $Y, without loss of range", but real-world conditions meant this actually did reduce their broadcast range a little (though mostly in low-population areas they barely cared about anyway).

This may have coincided with lower investment in broadcast anyway: replace the transmitting equipment, don't worry about providing as much range as the much older analog equipment did, because the per-viewer value has dropped enough that it's not worth it to reach rural areas, and so you cheap out on both equipment and broadcast strength.


There are many factors that come into this. The bandwidth of a single analog channel can accommodate 4 or more digital channels. If the transmitter is not upgraded, its power will now be spread to multiple channels.

Another is simply self-selection bias. When a rapid technical transition takes place, the average reception might improve significantly, but you can't guarantee it for everyone. A slight frequency change or a reflection pattern can drastically alter the signal intensity in a certain spot. But you will only hear from people who happened to have good analog and lost it; those with bad analog had already moved to cable and are unaware that good terrestrial digital has become available.

My experience with Freeview in the greater London area was very positive, with a very simple antenna I was able to get around 100 channels, the vast majority in perfect quality.


The people I hear from had bad analog. But you can watch bad analog; your TV is just covered in snow. Bad digital is unwatchable.


Does ATSC/NTSC work the same way as DVB-T/PAL with regard to signal quality at the same dB?


They are built upon the same basic principles - QAM modulation, error correcting codes and MPEG/HEVC codecs, so there should be no significant difference.

The newest ATSC 3.0 standard has a somewhat richer toolbox, allowing the optional use of 1024QAM or 4096QAM vs. 256QAM for DVB, more advanced LDPC error-correcting codes, LDM multiplexing, and MIMO. The industry seems to believe the added complexity is not worth the marginal benefits.
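For a sense of what those larger constellations buy you (raw bits per symbol only; real throughput depends on code rate, guard intervals and the SNR you can actually sustain):

    import math

    # Uncoded bits carried per symbol for the constellations mentioned above.
    for constellation in (256, 1024, 4096):
        print(f"{constellation:>4}-QAM: {int(math.log2(constellation))} bits/symbol")
    # i.e. 8, 10 and 12 bits/symbol: a 25-50% raw gain, before FEC overhead.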

The fundamental difference between Europe (or at least the UK) and the US seems to be the concern regulators had for improving over-the-air TV. While in the UK the move to digital was used to broadly improve the offering and reception quality, in the US there was less concern, and the stations found the death of analog a good opportunity to close down transmitters, recover spectrum and remove free programming.


It is. A so-called digital signal is still an analog signal which must be processed; you only obtain the digital data after you process it. The main issue here is that the digital signal is much more complex and, because of this, much more sensitive to noise. When it rains, for example, the analog signal will still be visible as long as the sync part can be decoded. For a digital signal, noise will be replaced by predicted frames. The issue is that a lot of providers have very bad quality and rely heavily on predicted frames, and when the key frames are not available (due to noise), good luck predicting the rest of the frames.


The difference between sync level and black level in analog TV standards is about 15% of maximum transmitter power. This means that any added noise of more than 7.5% of max power will confuse the TV into detecting sync as black and black as sync. In practice, a 20 dB SNR or less will be completely unwatchable.

The Shannon limit for a 5 MHz channel with 20 dB SNR is about 33 Mbit/s, and real DTV standards deliver a good fraction of that: easily 10, maybe 15 Mbit. That's enough for an HD channel in perfect quality, or many more in SD.
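The arithmetic behind that limit is just Shannon's C = B log2(1 + SNR); a quick sketch:

    import math

    def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
        """Channel capacity C = B * log2(1 + SNR) for an AWGN channel."""
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    for snr_db in (10, 20, 30):
        c = shannon_capacity_bps(5e6, snr_db)
        print(f"5 MHz @ {snr_db} dB SNR: {c/1e6:5.1f} Mbit/s theoretical maximum")
    # Real DTV systems land well below this once FEC, guard intervals and
    # implementation margins are paid for.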

So the problem is not with analog vs digital, that debate is long settled. It's the specific tradeoffs made when digital TV was deployed.


Actually, thanks to error correcting codes, a digital signal can get right up against the Shannon limit. You can't improve on the Shannon limit.


Many, if not most stations switched broadcast frequency during the transition from analog to digital.


One thing I don't see noted yet in the replies -- ATSC 1.0 (the current US standard) has awful difficulties with multipath signals (reflections off of buildings, terrain, etc.)

ATSC 3.0 is supposed to fix this, but brings with it its own feature bloat.


One problem I've recently become aware of is that LTE and 5G cellular signals are overwhelming the digital TV signals by raising the noise floor in the amplifier stage, even though they don't use the same frequencies.


So that may be true, in theory.

In reality, the switch from analog to digital means that while people in urban areas have more, clearer channels, people in rural areas (who could watch snow-covered analog TV) now have fewer channels, if any.

Even if a mathematical calculation says that my TV should work great, I still have nothing if I can't pick up any channels.


Even in urban areas we lost broadcasters in the transition to digital. It is just worse in a lot of ways. I don't care that my morning weather report has a clearer picture if it stutters seemingly at random. It doesn't go as far as unintelligible, just fantastically irritating.


The problem is that many stations migrated from well-propagating VHF to UHF with the digital migration. The higher-frequency signal experiences more fade due to atmospheric conditions and ground clutter. They also jam multiple channels into the same bandwidth the single old channel used to occupy, further reducing the effective EIRP per channel.
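To put a rough number on the VHF-to-UHF penalty, free-space path loss alone (FSPL = 20 log10(d) + 20 log10(f) + 32.44, with d in km and f in MHz) costs several dB before terrain and clutter are even considered. A sketch with illustrative channel frequencies standing in for a typical high-VHF vs. UHF move:

    import math

    def fspl_db(distance_km: float, freq_mhz: float) -> float:
        """Free-space path loss in dB (d in km, f in MHz)."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    d = 50  # km from the transmitter, an arbitrary example distance
    for label, f in (("high-VHF ch 10 (~195 MHz)", 195.0), ("UHF ch 30 (~569 MHz)", 569.0)):
        print(f"{label}: {fspl_db(d, f):.1f} dB free-space loss at {d} km")
    # The UHF path is ~9 dB worse from frequency alone; real-world clutter widens the gap.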


The VHF-to-UHF transition happened the same way here in Japan. After the digital TV transition, the empty VHF frequencies were considered worth using for something like a broadcast service for mobile devices, so such a service (NOTTV) was started. But the service failed because it wasn't attractive enough (and partly because the iPhone didn't support it), so the spectrum is now unused again. How are the VHF frequencies being utilized in other countries?


I remember old "snowy" TV and absolutely hated it. Bad analog is still a really bad experience, especially with audio. Worse, it never seems to be just "snowy"; the signal goes in and out randomly, so it goes from bad to unwatchable quickly, and somehow always during the important parts of the show. ;) I think people tend to play up idealized versions of analog TV and downplay how bad digital signal issues can be, a bit like how people hold up questionable movie special effects from the 70s and 80s as being great but also think all CGI looks fake.

I don't think asking people to run a proper antenna is asking too much, especially for all the benefits digital brings in.


This is a good point. But having done both...

With digital, if the stream desyncs, it's pretty much out for 30-40 seconds, at least for me, and then it repeats every few minutes. With analog, even while the signal was bad, you could in a pinch still watch it. I knew people who watched 2nd- or 3rd-generation VHS copies of movies at the time. They were not that picky. The color was gone and the sound warbled, but they were just happy to be able to watch the movie. But if it had just glitched out and died, they would not have bothered. It was also that they just did not know better was out there. I would show people a DVD and a VHS back to back on an SD TV and they suddenly "got it" that they had been watching snow. We look back on it now and see rubbish, but at the time they were just happy to get anything.


I still use an over-the-air antenna on a daily basis (and record from it using MythTV with ATSC tuners in a computer). The changeover from analog has been a bit of a give and take.

With analog broadcast, if you were a bit too far away you would still get a fuzzy image and sounds that had some interference with it. Now if you're a bit too far away the picture breaks up and the audio sounds like auto-tune. It doesn't take much of that to make it unwatchable.

On the other hand, when you get a solid signal the picture is amazing versus the standard-def of analog broadcasts. Add to this the benefit of multiple sub-channels on each broadcast channel and you get a lot more watching options.

I've installed an attic antenna and a pre-amplifier that supplies great signal to all coax (CATV) outlets in the house. That hardware cost what 1-2 months of cable TV subscription would have and that was years and years ago.

On balance the digital changeover has been much better in my experience.


> On the other hand, when you get a solid signal the picture is amazing versus the standard-def of analog broadcasts. Add to this the benefit of multiple sub-channels on each broadcast channel and you get a lot more watching options.

Subchannels are kind of nice, but junky content with junky compression is worse than a more limited set of snowy channels. Now, when they do broadcast stuff with enough bandwidth, and you're above the cliff the whole time, it's nice.


Yeah, I think it's a little ironic that the analog era by necessity had lower latency because buffers were expensive. In some ways the world as a whole had these brief experiences of synchronized emotion: everyone hearing the exact same thing within a second. With the long digital buffers made possible by cheap memory, we no longer experience that; buffers can be several seconds, maybe over a minute in total, even for live events. Buffer bloat killed cybernetic synchronicity. ;)


> Buffer bloat killed cybernetic synchronicity.

There's something very interesting about this concept of "cyber-sync" - when is now? Is your now different from mine?


The differences in latency from the live event to people seeing it were pretty apparent in the UK during the recent Euro 2020 tournament. I was surprised how many people were evidently watching on internet streaming services, judging by the length of the delays in their reactions vs. what I saw via satellite (the delay for satellite vs. terrestrial/cable is small by comparison, especially since both of those went digital). If you want the lowest latency to the live event, analogue radio is probably still the best for these types of things.


The BBC 4K streams were a good 70 seconds behind broadcast. Fastest/earliest stream is SD on Freeview from my testing.


I remember the standard delay for analog or satellite broadcast was 3-5 seconds due to some betting arrangement. So I really don't see why internet broadcasting can't be within that limit as well.


It probably has to do with the failure mode rather than actual rate of failure, although it may be perceived differently.

When digital signals "fail" (usually just minor interference or some power fluctuation), you usually just get a black screen or a picture that cuts in and out. When analog signals "failed" (in the same interference or fluctuation) then you likely got a distorted image/sound/colour for a few seconds but it didn't really interrupt your viewing experience.

It is an interesting observation that in areas where analog and digital were still transmitted in parallel, the "live" analog signal was "faster".


I'm not so sure what kind of error correction ATSC has compared to the DVB and ISDB standards, but the 8VSB modulation is not particularly great.

Also, ISDB has the idea of a "mobile" channel, which can be useful in low-SNR conditions (it is worse for pretty much anything else, though).


ATSC has only one FEC parameter compared to DVB-T and ISDB-T, which also have more robust modulation schemes. ISDB-T includes an interleaver that makes it more resilient to impulsive noise compared to DVB-T.

ISDB-T modulation scheme is very interesting. It is comprised of 13 sub-carriers ("segments"), and by spec, one of those can be the "mobile" channel (called 1seg, as it occupies one of the 13 segments and leaves 12 for the main mux). It can have different modulation parameters from the 12 main sub-carriers, so you can make it more robust with less available bandwidth and make the main mux less robust but with more bandwidth.

IIRC there are other ways to distribute the ISDB-T segments, but I don't remember how far out of spec that would be. Most 13-segment receivers (i.e. TVs, set-top boxes, some USB adapters) wouldn't care (at least 10 years ago they didn't) and it would work, though.
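A toy illustration of the segment split being described. The modulation choices below (QPSK for the 1seg service, 64QAM for the main mux) are just the commonly used combination, not the only legal one.

    from dataclasses import dataclass

    @dataclass
    class SegmentGroup:
        name: str
        segments: int    # out of the 13 ISDB-T segments
        modulation: str  # each group can use different parameters ("hierarchical transmission")
        robustness: str

    ISDB_T_EXAMPLE = [
        SegmentGroup("1seg (mobile)", 1, "QPSK", "very robust, low bitrate"),
        SegmentGroup("main mux (fixed)", 12, "64QAM", "less robust, much higher bitrate"),
    ]

    for g in ISDB_T_EXAMPLE:
        print(f"{g.name:>18}: {g.segments:>2} segment(s), {g.modulation:<6} -> {g.robustness}")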


Partial signal reception mattered in rural emergencies, but with ubiquitous cell phones that need has disappeared. With improving internet coverage providing on-demand media, the need for over-the-air television is shrinking and the spectrum can be put to better use.


And there's still the best broadcast method for emergencies: analog radio.


NTSC was the envy of many European console gamers back in the pre-HD era, because a lot of game ports to PAL (50Hz) were dodgy and no longer ran at the speed the original game designers had intended, and/or were squashed between black borders at the top and bottom of the TV screen.

Importing a JPN/US console was not only to get access to region-exclusive titles!


Funny thing: it was the other way around for many C64 games. Many NTSC ports of the time played too fast and had aspect ratio issues. Music from many EU Scene Demos was 20% faster on a US C64.

Good thing that's over.


Yeah, having access to a TV / monitor that supported 60Hz was gold in the UK.

That said, for broadcast TV, PAL looked far nicer - NTSC was named 'Never The Same Colour' for a reason.


I grew up in a part of Asia where PAL was the standard. Kids begged their parents to buy an NTSC TV so that they could play Nintendo (the Japanese version: the red and white one).

I feel old.


I had an apartment in Sydney for a while and got to experience PAL for the first time. I remember it looking really nice, but it had this kind of uncomfortable strobing effect.


My first thought when reading the headline: "why are they referring to the analog TV switch-off by using the name for the color TV standard?". I guess most of us associate NTSC with the color TV standard because of the competing NTSC/PAL/SECAM standards, but they also developed the original black-and-white analog TV standard, which was then extended in a backward-compatible way to allow for color.


Huh. What is the right term then... just "analogue TV broadcasting"?


I mean, this is America-specific, so "NTSC" is fine, especially since it is still used as a shorthand for "the American standard-definition video standard" in certain areas of videography.


Oh sure, we all knew what the article meant. I just wonder what the correct technical term encompassing the US analogue TV broadcasting would have been, as opposed to the system used in Europe, Japan, etc...


Formally? Unambiguously, CCIR System M is the international designation for the US* format of the signal. It does not state what color format it uses (PAL-M, in Brazil, is a frankenstein format not seen anywhere else), but domestically speaking, NTSC is (confusingly) both the B&W and the backwards-compatible color format, and if someone hadn't bothered to change the name to ATSC (the A standing for Advanced, not American), we could have had NTSC Digital 3.0 instead. NTSC (National Television System Committee) is the committee which formulated these standards; they didn't bother to give their inventions a more proper name the way SECAM and Telefunken's PAL have, or, in the digital domain, the EBU's DVB and ARIB's ISDB.

(SCART is another one of these, btw: it stands for Syndicat des Constructeurs d'Appareils Radiorécepteurs et Téléviseurs (Radio and Television Receiver Manufacturers' Association), and the connector was supposed to be called Péritel (insert IBA ENG video discussing SCART), but the SCART name stuck.)

* And Canada and...


Contrary to majority opinion, I like NTSC. It's a fun signal with a lot of hacks possible.

In retrocomputing terms, NTSC is something I don't mind having around and am able to deal with.

In watching TV / video terms? Yeah, NTSC is craaaaaappy! I won't miss it either, though I do enjoy the look on some programs.

DTV is pretty great. I live in a region where signals are reasonable, and it's just better overall. For the few hours I watch TV, DTV gets it done.

On the analog vs digital front, for things like embedded projects, I really dislike digital video. It's resource intensive. Analog RGB or Component is EASY, lean, and clean.

Right now, the golden time is ending. If you want analog video gear, good CRTs, that sort of thing, it's still possible, but it's starting to get expensive. I nabbed a nice multi-format PVM for my retro gaming/computing experiences, a few other displays, and a couple of TVs to modify into an arcade cab with a real, fast, gorgeous CRT.

Now is the time! It won't last long. If you want analog, get it. Soon it will be a curio: rare and expensive.


NTSC: Never Twice the Same Color


Indeed, though newer circuits perform well.


Hopefully it takes interlaced video[1] with it.

[1]: https://en.wikipedia.org/wiki/Interlaced_video


The EBU made compelling arguments that 720p would be better than 1080i for HD.

However, 1080 is a bigger number than 720, so most sources are interlaced.

4K at least got rid of interlacing as an option, so 2160 is always progressive.
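The raw numbers behind that EBU argument: per second, 720p60 pushes nearly as many pixels as 1080i60, despite the smaller frame height, and none of them need deinterlacing.

    # Pixels delivered per second for the two common HD broadcast formats.
    formats = {
        "720p60":  (1280, 720, 60, 1.0),   # progressive: 60 full frames per second
        "1080i60": (1920, 1080, 60, 0.5),  # interlaced: 60 fields/s, each half a frame
    }

    for name, (w, h, rate, fraction) in formats.items():
        pixels_per_second = w * h * rate * fraction
        print(f"{name}: {pixels_per_second/1e6:.1f} Mpixels/s")
    # ~55.3 Mpix/s for 720p60 vs ~62.2 Mpix/s for 1080i60: close, but 720p avoids deinterlacing artifacts.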


Unfortunately, no. The highest-quality broadcast I get on ATSC is 1080i (the i = interlaced).


Even ATSC 3.0 supports interlaced HEVC, and from what I've read, many HEVC decoders not designed for ATSC 3.0 struggle with it.


Wow, interlaced HEVC looks horrible. I wish it were never used for anything other than broadcast.


Meanwhile, games are dabbling with interlaced (or checkerboard) rendering plus a reconstruction pass.


One of the interesting things I've noticed is that (around here at least) the over-the-air TV is much higher quality than the local providers (cable and fiber). The local providers streams are noticeably bit starved to me, while OTA looks quite good.


> One of the interesting things I've noticed is that (around here at least) the over-the-air TV is much higher quality than the local providers (cable and fiber).

There's a substantial difference in bitrate between the OTA broadcast (for the main channel) and the cable version. Typical cable will stat-mux several channels down to well below 10 Mbps per channel. The main OTA channel tends to be around 15 Mbps (for a different encoding) and is not stat-muxed. So to some extent, the quality of the channel you're watching on cable/fiber depends on what is going on in the adjacent channels.


Closer to 20 Mbps, even, for a full-bandwidth OTA broadcast: https://en.wikipedia.org/wiki/ATSC_standards.


Though most broadcast stations have sub-channels which eat some of that bandwidth. In my market, almost every station broadcasts at least 3 video channels. A few of them even manage to squeeze 6 (!) into their 20 megabits (which would be fine with modern codecs, but ATSC 1.0 is stuck with MPEG-2).
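Quick arithmetic on that squeeze, using the nominal 19.39 Mbps ATSC 1.0 payload and an assumed ~1 Mbps set aside for audio and PSIP tables:

    ATSC1_PAYLOAD_MBPS = 19.39   # nominal MPEG transport stream rate for ATSC 1.0
    OVERHEAD_MBPS = 1.0          # rough allowance for audio, PSIP tables, etc. (assumption)

    for subchannels in (1, 3, 6):
        per_channel = (ATSC1_PAYLOAD_MBPS - OVERHEAD_MBPS) / subchannels
        print(f"{subchannels} video subchannel(s): ~{per_channel:.1f} Mbps each (MPEG-2)")
    # Six subchannels leaves ~3 Mbps of MPEG-2 per program, which is why it looks so rough.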


What I've been surprised about is the fact that cable has been really slow at adopting newer digital codec standards.

AFAIK, pretty much all cable transmissions are MPEG-2 (and were analog for a long time before that). It's a little nuts considering all the benefits and the readily available hardware decoders for newer standards. An H.264 stream would look WAY better with far fewer bits.


The one serious gripe I have with ATSC is the audio handling. When you have a drop out or freeze or other glitch due to poor signal it takes out both video and audio.

For most programs, most of the time, an audio loss or glitch is more damaging than a video loss or glitch. If they had given more bandwidth to audio so they could use more robust ECC, ATSC would have been a lot more watchable for those of us who don't have strong signals.


They have ECC in ATSC (that's part of how they detect the glitches). The issue they have is the audio codecs used in ATSC simply aren't robust, don't degrade gracefully, and require a lot of bandwidth.

The issue we have is that ATSC 1.0 has only just now become the standard everywhere, yet it is a standard from 1995. As you can imagine, we've come a LONG way in the last 26 years in terms of codec capabilities.

If we could just get people to upgrade equipment... ATSC allows for much more robust error handling but it's not used because few devices support it.


ATSC 3.0 has a new modulation scheme with more robust error correction and modern codecs, but it will take years for people to upgrade and for ATSC 1.0 to be phased out. Here's a video giving a preview of how 3.0 compares to 1.0 in real world conditions: https://www.youtube.com/watch?v=lgIm01Tsmt4


Yeah, it'll be great once it starts rolling out. It'll just take a long time.

Right now it seems like the only people deploying it are paid services (which, IMO, shouldn't be legal. We shouldn't have to share public broadcast frequencies with private corps).


I definitely agree that paid services should not be allowed to use public broadcast spectrum.


All I know is that all of the TV broadcasts in my area had very, very poor picture quality after the digital switchover, compared to what they were when analog.

The issue is clearly that the digital signals are being compressed to within an inch of their lives, since the poor quality is due to obvious compression artifacts.

So I don't think it's a technology problem, I think it's a greed problem: the TV stations are trying to subdivide their frequency allotment so they can fit in as many different video streams as possible. Either way, it's still a problem that OTA broadcasts in my area border on unwatchable.

Fortunately, that was also around the time that I stopped watching TV, so it wasn't a real problem for me (which is good, because the new digital signals literally did not come to my neighborhood -- we went from analog TV to no OTA TV at all). But it was impossible to not notice that broadcast TV quality became orders of magnitude worse, pretty much overnight.


The joke in the industry was that NTSC stands for "Never Twice the Same Color".


>If you are an American you may have heard of ATSC 3.0, perhaps by its marketing name of NextGen TV. Just like the DVB-T2 standard found in other parts of the world,

To those living in US, is ATSC 3.0 a thing yet?

It's funny that it mentions DVB-T2 as the counterpart in other parts of the world, because in much of Europe DVB-T2 has now been in use for over a decade. The earlier DVB-T2 deployments are stuck using AVC, though Germany and France have newer DVB-T2 variants with HEVC.

Or do Americans generally not care about free-to-view TV and only use cable or other pay-TV networks?


NeverTheSameColor won't be missed


I would have thought so too... I wasted a lot of time trying to get digital systems to generate and recognize NTSC signals. The color encoding was a disaster.

But I'm thinking now... I could at least build something from the ground up. I don't know how much machinery I would have to build to talk to an HDTV connector; maybe I would even have to apply for a key.


It was replaced with AlmostTwiceTheSameColour.


Is SECAM (System Entirely Contrary To American Methods) dead, too?


I think yes. It's used in only a couple of countries, mostly former French colonies and USSR republics.


And no tears were shed - at least, not by me...


Slightly off-topic, but something that bugs me about OTA broadcast DTV is the program guide (I'm in Australia, which uses DVB-T; other standards may differ). The guide is so slow to load, regardless of the TV set. It also seems to require the TV to scan each channel to pick up that channel's guide data, so you're left with gaps in the guide until the TV has scanned every channel. It seems like a missed opportunity. If we're going digital, why can't guide data be instantly available for all channels?



