
Interesting to see the angry comments about how Digital TV fails more than Analog.

I don't watch TV, so I don't know how the situation is in my country, but I have noticed when seeing broadcasts on public TVs (for example in bars and restaurants showing sports) that compression issues happen, decoding issues happen, and the whole thing is laggy. It's most obvious with sports: during analog TV times, I would hear the cheers and fireworks after goals, victories and so on all at the same time; now I hear them staggered, with seemingly poorer people cheering first, I assume because they have analog TVs and radios.




I grew up with analog OTA TV. When you changed channels, the change was instant. We got a handful of local channels.

Then, we got analog cable. When you changed channels, the change was still instant, and you used the same remote as for OTA TV (if you were rich enough for a remote) because the cable just plugged straight into the back of the TV. We got a bunch of channels, including the local ones from before, a handful of new channels worth watching, and a ton of others that were absolutely worthless. This was expensive, but at least cable channels were ad-free... until they were not.

Next we got digital cable. This marked a real downturn in user experience. The signal had to be processed by a box that we had to rent instead of just plugging the cable into the TV. We needed a new box and new remote for every TV. Theoretically the new remotes could replace the old ones, but that never worked out in practice. You always needed the old remote for some function not available or not programmed properly on the new one. The boxes also took forever to switch between channels. Channels now were no longer straightforward one-channel-to-one-station mappings, but you had subchannels and HD/SD versions of the same channel. My parents never understood that while they were alive and usually ended up watching the SD channel instead of the HD because it was all they could find. In fact, for them, there was usually no increase in picture quality, just in price and inconvenience.

Finally I inherited the TVs, but I never watch them, and I cancelled the cable subscription. I got a digital TV antenna, and it sometimes picks up most of the local stations, which were all that my parents ever watched, but usually it glitches or fails. Changing channels still takes forever, but at least the antenna just plugs into the back of the TV without a rented box. Of what is available on OTA television, as with what was available on cable, there isn't enough to keep my attention. Most of it is vulgar trash riddled with advertisements.

I've gone back to reading books for entertainment. They don't require any cable or antenna, no rented box, no remote control. I can get whatever I want from the library (for free!) or Amazon, and I can change books without lag. The signal never drops or glitches. I can take my books with me anywhere, including outside. I'm happier now.


There was a little while where we still had unencrypted digital cable (in the US at least). I'm sure it's still around in some places, but we lost the option years ago where I live.

Digital cable (QAM, or ClearQAM when unencrypted) came across the same wire as the rest of the stuff: cable internet, encrypted digital cable, etc., and you could tune it with any compatible ATSC/QAM tuner, like the one built into most TVs and loads of cheap capture cards.

I always hated the "digital cable boxes" required for encrypted content, but for a while, an old PC with a capture card and Windows Media Center let me have a much nicer interface and DVR without the extra rental costs. In fact, I once lived in a building that was split into three apartments and we found that as long as one of us had cable internet, we also had ClearQAM cable TV...

...which I imagine is why they got rid of it. Cable modems need to be provisioned. Cable boxes need to be rented if you want to decrypt the channels you'd paid to authorize decryption of. But QAM was just on the wire: so easy to use and so easy to mooch. So the cable companies used the confusion around the OTA analog-to-digital switch to also remove QAM and move everyone over to those rented decoder boxes.

Incidentally, it's also when I stopped watching cable TV. The boxes were an additional cost for a worse experience than I had with the built-in tuner or my HTPC. I thought about getting a new capture card with CableCard support, but that standard was a huge pain in the ass and never really took off outside of Tivo and similar turnkey devices. Thankfully, it's also when streaming and other OTT services started to take off.


Digital cable boxes with Firewire were a fun-but-super-tedious way for a brief few years to be able to record even digital cable content to a PC.

Now I just stream it. At least I don't have to think about it, but I don't "have" copies of it in the same way. Then again, the more decades go by, the less of that old stuff I recorded in 2010 I'd ever watch again from those old hard drives anyway.


Latency always goes up with technology.

Think of streaming or video-on-demand services, where it can now take up to five seconds to change channels.


> Latency always goes up with technology.

It's certainly possible to build non-causal signal processors for broadcast TV by using multiple tuners that buffer channel N+1 while you're watching channel N. You could do it for streaming services too by using statistics (or--dare I say it--AI) to predict what movie the user is most likely to sample next.

It might make the decoder cost an extra $2, and that's probably why nobody does it.
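A minimal sketch of that two-tuner idea, in Python. The tuner interface, class names and buffer sizes here are all hypothetical, just to show the shape of the technique, not any real driver API:

    import collections
    import threading
    import time

    class FakeTuner:
        """Stand-in for a real tuner driver; it just labels 'packets' with a channel."""
        def tune(self, channel):
            self.channel = channel
        def read_packet(self):
            time.sleep(0.01)                    # pretend we're waiting on the demodulator
            return f"ch{self.channel}-packet"

    class PrefetchingReceiver:
        """Watch channel N on one tuner while a second tuner keeps a rolling
        buffer of channel N+1, so "channel up" can start from buffered data."""
        def __init__(self, start_channel):
            self.active, self.standby = FakeTuner(), FakeTuner()
            self.channel = start_channel
            self.prefetch = collections.deque(maxlen=500)   # roughly a GOP or two
            self.active.tune(start_channel)
            self.standby.tune(start_channel + 1)
            threading.Thread(target=self._prefetch_loop, daemon=True).start()

        def _prefetch_loop(self):
            while True:
                self.prefetch.append(self.standby.read_packet())

        def channel_up(self):
            # Swap roles: the standby tuner (already locked and buffered) becomes active.
            self.active, self.standby = self.standby, self.active
            buffered = list(self.prefetch)      # decoding can start from this immediately
            self.channel += 1
            self.prefetch = collections.deque(maxlen=500)
            self.standby.tune(self.channel + 1) # start prefetching the next channel up
            return buffered

    rx = PrefetchingReceiver(start_channel=7)
    time.sleep(0.2)
    print(len(rx.channel_up()), "packets already buffered for the new channel")

The same shape works for the streaming case: replace the standby tuner with a prefetcher for whatever title the predictor guesses you'll pick next.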


Seems like you'd also need a second tuner which would increase the cost a bit more.

Ultimately the problem is that with old analog signals you just needed to wait for the next V-blank (four hundredths of a second at most) and you were good to go. With digital signals you need to wait for the next I-frame, which can take dozens or hundreds of frames. And this assumes you have a good solid signal and can decode everything perfectly; if the set is trying to piece together a fragmentary signal, it's even harder.
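To put rough numbers on that gap (the frame rate and GOP lengths below are assumed, illustrative values, not any particular broadcaster's settings):

    # Worst-case wait for a usable picture after a channel change (illustrative).
    ANALOG_FRAME_S = 1 / 25          # next full analog frame arrives within ~0.04 s
    FPS = 30                         # assumed digital frame rate
    for gop_frames in (15, 30, 60):  # keyframe every 0.5 s / 1 s / 2 s (assumed GOPs)
        wait_s = gop_frames / FPS
        print(f"GOP of {gop_frames:3d} frames -> up to {wait_s:.1f} s waiting for an "
              f"I-frame, vs ~{ANALOG_FRAME_S:.2f} s for analog")

And that's before any buffering the decoder adds on top.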


Possibly there are some tricks applicable to improve channel-switching performance (like including a low-res I-like frame more frequently?), but no one has an incentive to improve zapping performance.


It happens in computing too. Even on very recent machines and very recent OS kernels, you can notice the layers. Very odd to see.


Because the developers are lazy. Why do we need 20 layers of libraries to draw a rectangle? Give a developer a library and he will use it.


Give the developer no library and they will find a way to slam the bits into VRAM.

That said, the friction for modern devs to go from zero to cross-platform app is much lower with Electron than it is with any other framework. So learn to love Electron, because it is here to stay.


Ah yes, I've gone without watching TV for a good 25 years now. I got Netflix on and off a few times, but most of the shows are such a brainwash that it's not worth it. Books and people, that's it for me now.


Well, ackshually...

Cable systems in the USA took advantage of more frequencies than were available over terrestrial television to get more channels. This is because the cable signal was still RF, but not subject to radio interference in any significant way because it came through a cable. The upshot of this was that unless your TV was "cable-ready", i.e., capable of tuning the additional frequencies, in order to use even analog cable you needed a set-top box. It took a few years for cable-ready TVs to hit the US market -- but by the 90s they were ubiquitous.


It is worse if you have a poor signal. Digital either works and looks great or it's not watchable. An analog station might look kinda snowy but be completely watchable. I used to be able to get a few channels on my bedroom TV with rabbit ears before the switchover; after the switchover I got 1 or 2 on a good day, because the signal strength just wasn't strong enough.


Snowy analog with intelligible sound translates to about a 40 dB signal-to-noise ratio. That's an SNR where digital will deliver a crystal clear image. When digital becomes glitchy, you will absolutely be unable to watch analogue: you will have trouble maintaining vertical and horizontal sync, and the sound will be drowned in noise.


There must be some other factor at work, then, because I had the same experience when I didn't have cable around 2010. For a few big sports events I tried to get broadcasts, and analog was watchable (by analog standards) when digital was unwatchable because of frequent glitching. Buying a fancy antenna did not help.

Maybe television stations did not care enough about broadcast anymore to invest in digital broadcasting capability that matched the power of their old analog broadcasts.


I bet a bunch of them did some naïve calculation that said "because digital performs better, we can reduce broadcast power by X, for a savings of $Y, without loss of range", but real-world conditions meant this actually did reduce their broadcast range a little (though mostly in low-population areas they barely cared about anyway).

This may have coincided with lower investment in broadcast anyway. When replacing the transmitting equipment, you don't care about providing as much range as the much older analog equipment did, because the per-viewer value has dropped enough that it's not worth it to reach rural areas, so you cheap out on both equipment and broadcast strength.


There are many factors that come into this. The bandwidth of a single analog channel can accommodate 4 or more digital channels. If the transmitter is not upgraded, its power is now spread across multiple channels.
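As a rough back-of-envelope for the "4 or more" claim, assuming a US-style channel carrying an ATSC 1.0 mux of about 19.4 Mbit/s and illustrative per-programme bitrates:

    # One analog channel's bandwidth now carries a ~19.4 Mbit/s digital mux that
    # the broadcaster splits among programmes; the bitrates below are illustrative.
    MUX_MBPS = 19.4
    programmes = {"main HD": 10.0, "SD sub 1": 3.0, "SD sub 2": 3.0, "SD sub 3": 3.0}

    used = sum(programmes.values())
    print(f"{len(programmes)} programmes, {used:.1f} of {MUX_MBPS} Mbit/s used, "
          f"{MUX_MBPS - used:.1f} Mbit/s left for tables and overhead")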

Another is simply self-selection bias. When a rapid technical transition takes place, the average reception might improve significantly, but you can't guarantee it for everyone. A slight frequency change or a reflection pattern can drastically alter the signal intensity in a certain spot. But you will only hear from people who happened to have good analog and lost it; those with bad analog had already moved to cable and are unaware that good terrestrial digital has become available.

My experience with Freeview in the greater London area was very positive, with a very simple antenna I was able to get around 100 channels, the vast majority in perfect quality.


The people I hear from had bad analog. But you can watch bad analog; your TV is just covered in snow. Bad digital is unwatchable.


Does ATSC/NTSC work the same way as DVB-T/PAL with regard to signal quality at the same dB?


They are built upon the same basic principles - QAM modulation, error correcting codes and MPEG/HEVC codecs, so there should be no significant difference.

The newest ATSC 3.0 standard has a somewhat richer toolbox, allowing the optional use of 1024- or 4096-QAM vs 256-QAM for DVB, more advanced LDPC error-correcting codes, LDM multiplexing and MIMO. The industry seems to believe the added complexity is not worth the marginal benefits.
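One way to see why the bigger constellations are only a marginal win: raw bits per symbol grow only logarithmically with constellation size, and the higher orders need a much cleaner signal than most OTA viewers have. A quick check:

    import math

    # Raw (pre-FEC) bits carried per symbol for each constellation size.
    for m in (256, 1024, 4096):
        print(f"{m:4d}-QAM: {int(math.log2(m))} bits/symbol")
    # 256-QAM: 8, 1024-QAM: 10, 4096-QAM: 12 -- a 25-50% raw gain that only
    # pays off at SNRs well above typical reception conditions.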

The fundamental difference between Europe (or at least the UK) and the US seems to be how much regulators cared about improving over-the-air TV. So while in the UK the move to digital was used to broadly improve the offering and reception quality, in the US there was less concern, and the stations found the death of analog a good opportunity to close down transmitters, recover spectrum and remove free programming.


It is. A so-called digital signal is still an analog signal which must be processed; you only obtain the digital signal after you process it. The main issue here is that the digital signal is much more complex and, because of this, much more sensitive to noise. When it rains, for example, the analog signal will still be visible as long as the sync part can be decoded. For a digital signal, the noise will be covered up with predicted frames. The issue is that a lot of providers have very bad quality and rely heavily on predicted frames, and when the key frames are not available (due to noise), good luck predicting the rest of the frames.


The difference between sync level and black level in analog TV standards is about 15% of maximum transmitter power. This means that any added noise exceeding 7.5% of max power will confuse the TV into detecting sync as black and black as sync. In practice, a 20 dB SNR or less will be completely unwatchable.

The Shannon limit for a 5 MHz channel at 20 dB SNR is about 33 Mbit/s, and the latest DTV standards deliver a good fraction of that in practice: easily 10, maybe 15 Mbit/s. That's enough for an HD channel in perfect quality, or many more in SD.
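A quick sanity check of that limit from the Shannon-Hartley formula C = B * log2(1 + S/N), using the 5 MHz bandwidth and 20 dB SNR figures from the paragraph above:

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_db):
        """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    # 5 MHz channel at 20 dB SNR -- the regime where analog is already unwatchable
    print(f"{shannon_capacity_bps(5e6, 20) / 1e6:.1f} Mbit/s")   # ~33.3 Mbit/s upper bound

Real systems land below that because of guard intervals, pilots and FEC overhead.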

So the problem is not analog vs digital; that debate is long settled. It's the specific tradeoffs made when digital TV was deployed.


Actually, thanks to error correcting codes, a digital signal can get right up against the Shannon limit. You can't improve on the Shannon limit.


Many, if not most stations switched broadcast frequency during the transition from analog to digital.


One thing I don't see noted yet in the replies -- ATSC 1.0 (the current US standard) has awful difficulties with multipath signals (reflections off of buildings, terrain, etc.)

ATSC 3.0 is supposed to fix this, but brings with it its own feature bloat.


One problem I've recently become aware of is that LTE and 5G cellular signals are overwhelming digital TV signals by raising the noise floor in the amplifier stage, even though they don't use the same frequencies.


So that may be true, in theory.

In reality, the switch from analog to digital means that while people in urban areas have more and clearer channels, people in rural areas (who could watch snow-covered analog TV) now have fewer channels, if any.

Even if a mathematical calculation says that my TV should work great, I still have nothing if I can't pick up any channels.


Even in urban areas we lost broadcasters in the transition to digital. It is just worse in a lot of ways. I don't care that my morning weather report has a clearer picture if it stutters seemingly randomly. It doesn't go as far as being unintelligible, just fantastically irritating.


The problem is that many stations migrated from well-propagating VHF to UHF with the digital migration. The higher-frequency signal experiences more fade due to atmospheric conditions and ground clutter. They also jam multiple channels into the same bandwidth the single old channel used to occupy, further reducing EIRP per channel.


The VHF -> UHF transition happened here in Japan too. After the digital TV transition, the vacated VHF frequencies were considered worth using for something like a broadcast service for mobile devices, so such a service (NOTTV) was launched. But it failed because it didn't attract users (and partly because the iPhone didn't support it), so the spectrum is now sitting unused again. How are the old VHF frequencies being used in other countries?


I remember old "snowy" TV and absolutely hated it. Bad analog is still a really bad experience, especially with audio. Worse, it never seems to be just "snowy": the signal goes in and out randomly, so it goes from bad to unwatchable quickly, and somehow always during the important parts of the show. ;) I think people tend to play up idealized versions of analog TV and downplay how bad digital signal issues can be, a bit like how people hold up questionable movie special effects from the 70s and 80s as being great but also think all CGI looks fake.

I don't think asking people to run a proper antenna is asking too much, especially for all the benefits digital brings in.


This is a good point. But having done both...

With digital, if the stream desyncs, it pretty much goes out for 30-40 seconds, at least for me, and then it repeats every few minutes. With analog, even while the signal was bad, you could in a pinch still watch it. I knew people who watched 2nd- or 3rd-generation VHS copies of movies at that time. They were not that picky. The color was gone and the sound warbled, but they were just happy to be able to watch the movie. But if it had just glitched out and died, they would not have bothered. It was also that they just did not know better was out there. I would show people a DVD and a VHS back to back on an SD TV and they suddenly 'got it' that they had been watching snow. We look back on it now and see rubbish, but at the time they were just happy to get anything.


I still use an over-the-air antenna on a daily basis (and record from it using MythTV with ATSC tuners in a computer). The changeover from analog has been a bit of a give and take.

With analog broadcast, if you were a bit too far away you would still get a fuzzy image and sounds that had some interference with it. Now if you're a bit too far away the picture breaks up and the audio sounds like auto-tune. It doesn't take much of that to make it unwatchable.

On the other hand, when you get a solid signal the picture is amazing versus the standard-def of analog broadcasts. Add to this the benefit of multiple sub-channels on each broadcast channel and you get a lot more watching options.

I've installed an attic antenna and a pre-amplifier that supplies great signal to all coax (CATV) outlets in the house. That hardware cost what 1-2 months of cable TV subscription would have and that was years and years ago.

On balance the digital changeover has been much better in my experience.


> On the other hand, when you get a solid signal the picture is amazing versus the standard-def of analog broadcasts. Add to this the benefit of multiple sub-channels on each broadcast channel and you get a lot more watching options.

Subchannels are kind of nice, but junky content with junky compression is worse than a more limited set of snowy channels. Now, when they do broadcast stuff with enough bandwidth, and you're above the cliff the whole time, it's nice.


Yeah, I think it's a little ironic that the analog era by necessity had lower latency because buffers were expensive. In some ways the world as a whole had these brief experiences of synchronized emotion… everyone hearing the same exact thing within a second, which, with the long digital buffers enabled by cheap memory, we no longer experience… buffers can be several seconds, maybe over a minute in total, even for live events. Buffer bloat killed cybernetic synchronicity. ;)


> Buffer bloat killed cybernetic synchronicity.

There's something very interesting about this concept of "cyber-sync" - when is now? Is your now different from mine?


The differences in latency from the live event to people seeing it were pretty apparent in the UK during the recent Euro 2020 tournament - I was surprised how many were evidently watching on internet streaming services by the length of the delays in reaction vs what I saw via satellite (the delay for satellite vs terrestrial/cable is small by comparison, especially since both of those went digital). If you want the lowest latency to the live event, analogue radio is probably still the best for these types of things.


The BBC 4K streams were a good 70 seconds behind broadcast. Fastest/earliest stream is SD on Freeview from my testing.


I remember the standard delay for analog or satellite broadcast was 3-5 seconds due to some betting arrangement. So I really don't see why internet broadcasting can't be within that limit as well.


It probably has to do with the failure mode rather than actual rate of failure, although it may be perceived differently.

When digital signals "fail" (usually just minor interference or some power fluctuation), you usually just get a black screen or a picture that cuts in and out. When analog signals "failed" (in the same interference or fluctuation) then you likely got a distorted image/sound/colour for a few seconds but it didn't really interrupt your viewing experience.

It is an interesting observation that, in areas where analog and digital were still transmitted in parallel, the "live" analog signal was "faster".


I'm not so sure what kind of error correction ATSC has compared to the DVB and ISDB standards, but the 8VSB modulation is not particularly great.

Also, ISDB has the idea of a 'mobile' channel, which can be useful in low-SNR conditions (it is worse for pretty much anything else, though).


ATSC has only one FEC parameter compared to DVB-T and ISDB-T, which also have more robust modulation schemes. ISDB-T includes an interleaver that makes it more resilient to impulsive noise compared to DVB-T.

The ISDB-T modulation scheme is very interesting. It is composed of 13 groups of sub-carriers ("segments"), and by spec one of those segments can be the "mobile" channel (called 1seg, as it occupies one of the 13 segments and leaves 12 for the main mux). It can have different modulation parameters from the 12 main segments, so you can make it more robust with less available bandwidth, and make the main mux less robust but with more bandwidth.

IIRC there are other ways to distribute the ISDB-T segments, but I don't remember how far out of spec that would be. Most 13-segment receivers (i.e., TVs, set-top boxes, some USB adapters) wouldn't care (at least 10 years ago they didn't) and it would work, though.
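A toy sketch of that segment split, just to make the layering concrete; the modulation, FEC and bitrate values below are plausible examples picked for illustration, not any real broadcaster's configuration:

    # Illustrative ISDB-T hierarchical layers: 1 robust "1seg" segment + 12 for the main mux.
    layers = {
        "1seg (mobile)": {"segments": 1,  "modulation": "QPSK",  "fec": "2/3", "approx_mbps": 0.4},
        "main mux (HD)": {"segments": 12, "modulation": "64QAM", "fec": "3/4", "approx_mbps": 17.0},
    }

    assert sum(layer["segments"] for layer in layers.values()) == 13   # the full channel

    for name, p in layers.items():
        print(f"{name:14s} {p['segments']:2d} segment(s), {p['modulation']:6s} "
              f"FEC {p['fec']}, ~{p['approx_mbps']} Mbit/s")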


Partial reception of a signal mattered in rural emergencies, but with ubiquitous cell phones the need disappeared. With improving internet coverage providing on-demand media, the need for over-the-air television is shrinking and the spectrum can be put to better use.


And there's still a best broadcast method for emergencies: analog radio.



