2. Sly TVs never get Internet access in my house. I don't plug Ethernet into them, and I don't configure their wifi. If I must give one access because it needs a firmware update to fix a bug or something, I temporarily set my guest wifi network's password to something random, configure the TV to use that, apply the fix, then reset the guest wifi password back to the old value.
3. I trust literally any other video device in my house more than a sly TV. While we use Apple TVs, there are plenty of excellent alternatives that have reputations for being decent about privacy. Apple hasn't been sued for spying on my TV watching; Vizio has. The FBI hasn't warned me about Apple.
4. All sly TV OSes are junk with terrible apps. Seriously, slap an Apple TV (or Roku or Fire Stick or ...) in the HDMI jack and your viewing experience will be better in every way. I'm not suggesting you make your TV worse; this makes it much, much nicer.
I don't want to come across as paranoid, but sly TVs have a history of one headline after another of reporting your viewing habits, injecting advertising, and otherwise bringing spyware into your living room. This isn't some big new explosive revelation, it's a normal Tuesday in the news. Don't put up with it! There are much better alternatives.
I call them telescreens, comrade.
Why have a separate device and two remotes when you don't have to? I'm convinced that the marketshare for these external media devices will flatline and steadily decline. In the case of Apple, the only way they can try to compete in the future is releasing their own standalone TV panel. I suppose they saw the writing on the wall, which is why AirPlay and Apple TV are now available on Samsung's TV OS.
It's been shown time and again that developer support of past products is atrocious. You just can't count on getting your apps updated through the whole lifecycle of the device. This is partly understandable, as older hardware just isn't going to be able to handle some things (say, H.265). But even so, the developers seem to be doing little more than paying lip service.
The best strategy is to have a dumb monitor, and a device like Roku that you can upgrade very cheaply every year or two.
That's more a criticism of "crappy Android boxes" than praise for built-in TV applications. The latter are still barely usable and quickly become obsolete (with updates stopping 1-2 years down the line); compare that to a good Android box like the Nvidia Shield.
> Why have a separate device and two remotes when you don't have to? I'm convinced that the marketshare for these external media devices will flatline and steadily decline.
You may be right long term but it's been years since we've had "Smart TVs" and they are still far from being as good as decent media boxes. That to me means there's an underlying reason, probably a misalignment of incentives.
And btw, depending on your needs you can use a single remote just fine (I use a Fire TV Stick bluetooth remote, connected to the Nvidia Shield, for controlling the Shield, the Sony TV and the Yamaha amplifier through HDMI-CEC). With such a combination of different parts from different manufacturers, I'm surprised how well it all works.
Sorry, but this is just no longer the case. Even my neighbour doesn't turn on his Nvidia Shield anymore because his Samsung TV OS is good enough and the Youtube/Netflix apps are fantastic. Samsung's OS is where the dominant marketshare is, so the apps are great.
I think it'll be a long time before any TV has a native app store as decent as Apple's (and probably Amazon's, but I don't have personal experience with that so can't vouch for it).
And as others have pointed out, you're pretty much stuck with what it ships with. If my Apple TV breaks or gets old and unsupported, I can trivially switch to a Roku. If the SoC in my TV gets old and unsupported, there isn't jack I can reasonably do about it other than throw out the whole TV and get a new one.
I use USB based updating for the TV to avoid ever having to give it Internet access. Since it's never connected to the Internet I don't need to care much if it's running the latest version or not, for all I care it could be running the same version until it stops working as long as the current version is good enough.
My TV runs RokuOS. I've recommended it to everyone I know in the TV market and none have been disappointed so far. They treat the 'smart tv' aspects as first class rather than a buggy afterthought. I'm much more optimistic about it being updated for a long time as well.
Are there any negatives to doing this? I'm running into issues with a Vizio soundbar working with a Vizio V Series tv and was contemplating connecting to the internet to see if there's any updates but was holding out on finding a fix. I may have to try your method though.
I'm with you. The challenge comes (if it hasn't already) when TVs start embedding 4G/5G SIM cards to bypass such requirements.
My only hope is that enough people use the Smart TVs as designed, with Internet configured, that TV manufacturers never see ROI in going the SIM card route. fingers crossed
Are the cellular providers going to give free cellular access to these manufacturers or something? Consumers aren't going to pay for that. No one's going to buy a TV that requires an additional $30/month for a data plan.
Today that revenue subsidizes the cost of the TV itself, the additional cost of 4G/5G based service for small bits of data is not much. Other options can be LoRaWAN, NB-IoT or other such services since the data transferred is going to be small, primarily analytics. All of these are pretty cheap both to embed in the TV as well as the service, especially at scale.
I actually wouldn't be surprised in the least if this became commonplace down the road. Just open up your new telescreen and bam, it seamlessly connects to the 5G/wireless network without issue. You'll get every update, ad, tracker, and piece of software automatically pushed to the machine without the need for the user to intervene.
*lifetime - defined as "lifetime of the device", whatever that means. 5 years in and it still works fine.
However, I must point out that a satnav device's occasional updates, as I said, don't use much data: just a multi-megabyte download every few months at most. But streaming constant surveillance data from a TV to the "mother ship" is going to require far more data than that. Multiply this by an average of 1-3 TVs per household, and this seems like a big load on the cellular system.
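A rough back-of-envelope comparison makes the gap concrete. All the rates below are made-up but plausible assumptions for illustration, not measurements of any real TV:

```python
# Illustrative numbers only -- both rates are assumptions.

# Sat-nav: a multi-megabyte map update every few months.
satnav_mb_per_month = 10 / 3  # ~10 MB every ~3 months

# ACR-style surveillance: suppose the TV uploads a ~20 KB frame
# fingerprint every 10 seconds during 4 hours of daily viewing.
uploads_per_day = 4 * 60 * 60 / 10
tv_mb_per_month = uploads_per_day * 20_000 * 30 / 1e6

print(f"satnav: ~{satnav_mb_per_month:.1f} MB/month")
print(f"tv ACR: ~{tv_mb_per_month:.0f} MB/month")  # ~864 MB/month
```

Even with fairly conservative per-upload sizes, the TV's chatter comes out a couple of orders of magnitude above the sat-nav's, and that's per set.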
Did you mean wrap the antenna or something like that? I don't have radio equipment but would love to know more.
Note: that's expensive AF. I probably wouldn't literally use it to block the signal when a thin steel piece would probably work just as well. But it's an example of something you absolutely could do: find out where the antenna is, then severely interrupt its signal.
Never mind that all backdoors work more or less the same.
I.e., they have a list of TV models for which they know how to disable network connectivity without causing any other problems. You order from them, and for some reasonable fee they'll make the change, verify everything is okay, and ship you the unit.
Personally, I'd be willing to pay a $100-200 premium for that, especially for a TV that (pre-lobotomy) sells for over $600.
The FBI issuing a warning about this is helpful. It may help create a market for dumb TVs that cost more than smart TVs.
Yeah, this is like how a laptop can be sold cheaper with Windows OS than with Linux: the Windows version is preloaded with a ton of crapware that the mfgr gets paid to preload, so even with the enormous (as a fraction of BOM) price of the Windows license, the laptop comes out cheaper.
So a smart TV is nothing more than an LCD with an Arm SoC. Of course that SoC is completely locked down, and there are no specs outside of NDAs.
It got me thinking: how hard would it be to go to a big manufacturer that makes these smart TVs for chains like Best Buy and contract them to make the same set without the proprietary garbage hardware board? Instead, a board built around an open SoC and software ecosystem is installed, so the ethernet and USB ports do what they're supposed to do. That, or pick a very common TV model, figure out the LCD interface, and build a retrofit board. All of this is easier said than done.
Then imagine how fun it would be to hack your TV to do whatever because it's a regular computer? Of course the big issue that stands in the way is the HDMI and DRM burdens.
(cue someone replying with: just don't connect the smart TV to the internet and hook up a raspberry pi.)
How "modern" was this TV? Don't they all use LED backlights now? Backlight inverters should be obsolete now.
Is it just that they're low-volume SKUs? Or are they somewhat more ruggedized than consumer models?
(I have no evidence for this but I've seen this comment posted many times before)
It goes into how running content recognition software against whatever's on the TV display lets them collect and aggregate data to sell, which helps them sell the TV for very low prices without having to worry about being profitable on the initial sale, thanks to data and other monetization streams. Sony, Samsung, and the other major TV players all run very similar Automatic Content Recognition systems.
What's insulting is when super premium TVs STILL include all the same tracking junk. A $3k+ TV shouldn't need any subsidizing at all, yet here we are.
Many of them have integrated signage software too like MagicInfo and have video wall capabilities etc.
When I looked a larger high end 4K consumer TV was cheaper than a smaller 1080P commercial display.
Unless you run Netflix in MS Edge (or the netflix windows store app, which wraps MS Edge) and use a Skylake or newer CPU with a secure enclave / AMD PSP, Netflix won't stream you their highest-quality content.
Or they will happily deliver it to whatever Android device or smart TV that you like.
(also, PC support for HDR content is... "lacking", shall we say)
The fact that they're not an advertising company? That their business model doesn't depend on monetizing your personal data? That being privacy-oriented is a distinguishing factor compared with their competition?
Most TVs include speakers. Even the non-smart ones. By adding networking and apps, that TV can be used stand-alone.
For people starting out with their first home theater setup, they can get started with just the TV. With a dumb TV, they'd need the TV, and a receiver/speakers setup or a soundbar, and something to provide content like a Blu-ray player or cable box or Roku.
By making the TV smart, the TV maker has a better chance of having people start out with a higher end stand-alone TV and then adding other components later. I presume that the TV maker would rather have the consumer spend all of their initial budget on the TV, instead of splitting it between TV, receiver, and Blu-ray player.
For people with existing home theater setups, a similar thing goes on with upgrades. For example, I have a Denon AVR-1913 receiver, a Comcast X1 DVR, and a Sony Blu-ray player, and a Samsung TV.
It was a smart TV but I never used it as anything other than a dumb monitor. I used apps, but not on the TV. The Blu-ray player had a full set of streaming apps, and so does the X1. The Denon receiver has an internet radio app, and also supports Spotify Connect and AirPlay. So there was never any need to use an app on the TV. (Also, when I tried it once out of curiosity, I could not get ARC to work to send sound back to the Denon receiver.)
But then the TV died. I ended up getting another Samsung, but this new one is 4K UHD.
None of my other components are 4K. I didn't want to upgrade the rest to 4K at that time, so for things I wanted to watch in 4K I used apps on the TV .
With dumb TVs, I probably would not have bought a 4K TV. I would have waited until my other components were 4K, and that could have taken quite a while. I tend to keep receivers until they die.
In summary, with smart TVs, the TV can meet the needs of a wider class of consumer. As long as they don't make it so you have to give the TV network access or use its smart features, it can meet the needs of almost every class of consumer. That's attractive to the manufacturers.
 ...which meant that I did need to figure out ARC. As with the old Samsung, it just did not seem to want to work. Then I noticed something in the Denon manual:
> When the ARC function is used, connect a device with a "Standard HDMI cable with Ethernet" or "High Speed HDMI cable with Ethernet" for HDMI.
I made sure to use such a cable...and it worked, which makes no sense. It would make sense if we were talking eARC, but with plain ARC--which is all my 6- or 7-year-old Denon supports--it should work with any regular old HDMI cable.
Get a decent set of network hardware so you can monitor everything that's going on. I can recommend UniFi by Ubiquiti, despite the recent blowback about crash reports being sent to them.
Then we'll see the manufacturers include SIM cards, at least in countries where it's affordable.
A quick google search doesn’t show any Sony TV with this capability.
Can you give me an example of any consumer TV that supports Ethernet over HDMI?
When I asked about data sharing with these DVR boxes and the new streaming stick option, the only thing they could tell me was that 'virtually all video decoding and streaming devices are sharing data with multiple third parties'.
I returned the dvr box, and have not chosen to use their new streaming stick.
If your neighbor, for example, has a syndicated WiFi like Fios or Xfinity, your TV can try those. Also, it has 24/7 free time to sit there and break passwords, if needed. Even a slow CPU would score some eventually. And finally, the room is probably full of other devices which themselves may have connections; a TV can communicate ultrasonically with those to exfiltrate your data.
It would be orders of magnitude cheaper and safer to build in 4G connectivity and use it to upload the relatively small payloads.
I wonder how many already have.
Are we going to need a spectrum analyzer to watch this junk?
A big selling point (for better or worse) is not requiring you to buy more boxes or hook up wires. Just as you see with cheap IP cameras which neglect to steer users toward secure setup and only tout the "easy wireless setup! Just point to the picture on your iPhone screen!" It's the flip side of the ubiquitous marketing bullet points: Easy! No wires! Apps!
I fall into this camp - I really like using the physical remote w/physical buttons to play/pause/rewind, change volume, etc. I used to be a heavy chromecast user but switched for these reasons.
Furthermore, if its using 5G then its not tied to the IP address that my other traffic comes from and so they can't use the TV to target me with ads online either. (Provided I don't use any of the same accounts on both the TV and my other devices.)
And, that's before you consider that they can also scan local wifi and bluetooth spectra to glean even more information about their local environments. With participation of vendor applications on phones, they could also use ultrasonics or crazier things like modulated LED lighting.
Finally, if one of those devices is compromised, it can then be used to launch local attacks via wifi and bluetooth, or even potentially those other covert channels if they can exploit sloppy code in those other dark apps...
Taking those things into account as well as the things paulmd pointed out in a sibling comment, I am no longer fine with the idea of 5G in my TV.
Also, virtually any device with cell connectivity includes a GPS on the same chipset, so it's pretty trivial to connect that with Google location data. So if it has 5G, it has your location, and it has your identity.
Not on your LAN but still harvesting your usage data (a feature!), and still hackable (smile for the camera!).
I don't know of any TVs that come with a camera built into them, and if there are any I certainly wouldn't buy one of those.
Edit: I guess I was mistaken. They were not caught doing that yet.
Zeek (or Wireshark) can reveal what domains and IP addresses are being accessed. Pi-hole and iptables can block those addresses. Pi-hole needs to be the only DNS allowed, and iptables needs to be running in between the TV and its path to the internet.
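To sketch the iptables side of that (the addresses here are placeholders for your own network, and this assumes the box running iptables routes between the TV and the internet):

```shell
# Placeholder addresses -- substitute your own.
TV=192.168.1.50        # the TV's static IP
PIHOLE=192.168.1.2     # the Pi-hole

# Let the TV resolve DNS only through the Pi-hole...
iptables -A FORWARD -s "$TV" -d "$PIHOLE" -p udp --dport 53 -j ACCEPT
iptables -A FORWARD -s "$TV" -d "$PIHOLE" -p tcp --dport 53 -j ACCEPT

# ...and drop DNS to anything else, including hardcoded
# resolvers like 8.8.8.8 that some devices fall back to.
iptables -A FORWARD -s "$TV" -p udp --dport 53 -j DROP
iptables -A FORWARD -s "$TV" -p tcp --dport 53 -j DROP

# Log, then drop, everything else the TV sends off the LAN.
iptables -A FORWARD -s "$TV" ! -d 192.168.1.0/24 -j LOG --log-prefix "TV: "
iptables -A FORWARD -s "$TV" ! -d 192.168.1.0/24 -j DROP
```

Note that this doesn't catch DNS-over-HTTPS, which rides on port 443 and looks like any other TLS traffic; the final log-and-drop rules are what actually keep the TV off the WAN.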
>Perhaps the best April Fools’ joke of the (Youth) Journal [a Dutch news program] came in 1969. Special cars would drive around with scanners to track down television owners who hadn’t paid the license fee [it’s like in the UK, where you have to pay a fee to the BBC]. But if you wrapped the TV in aluminum foil, you could fool the scanners. Before long there was no foil to be found in stores. http://business.time.com/2007/08/21/why_dutch_people_wrapped...
Also, side benefit of no large black rectangle :)
I'm looking to add a TV, large computer monitor, or similar to our living room for PC gaming, XBox gaming, and movies.
I assumed that a projector would be washed out whenever the room's lights were on. Which would be a hassle for people who wanted to be in the living room at the time but were doing other activities.
I went with an Epson 1060 (which is a fairly budget option) and it's freaking awesome. I was afraid it would be dim in full sunlight, but it's not a problem at all. My screen is next to two windows and even in full daylight with the lights on it is super bright and visible, even in eco mode (which is dimmer than regular mode). It fully replaced my TV at a much, much lower price point than the biggest possible LCD. Also, replacement bulbs are ~$60 and I have yet to replace one after ~2 years.
Added bonus is the option to install a powered retractable screen, which make the TV all but disappear from the room when not in use.
When the room is dark it's absolutely amazing, like being at the movies. When lights are on it is quite washed out. Still usable but not as enjoyable. Since the projector sits in my bedroom it's not much of an issue to always have the lights off but in a living room I can see that being a hassle.
For the price of the projector and screen (they do make a difference) I could have gotten several cheaper modern TVs, but the experience of the projector is what I love. One thing I wish I'd known as a new buyer is how much heat projectors can generate. My bedroom is fairly small and this thing can really heat the room up, sometimes to an uncomfortable level after long viewing sessions.
That thing used to be 7000+ CAD. Even more when they first came out. In between legs at school I sold them in Calgary.
I’ve had to replace the bulb once in about 4 years. Or rather the timer went off to warn me and I replaced it — bulb was still working though. New lamp was fairly cheap too IIRC, $40, maybe less?
And I can take it with me, or outside in my backyard for a movie night. I probably won’t ever go back to smart or dumb tvs
It is fun having a >100” screen with a laser projector. I haven’t seen any of the new ones in person, but plan to get one some day.
I throw cult movie nights and recently ditched the projector for a 42” screen - by the time there were 15-20 people in my living room, the heat of the projector made us keep windows open in Canadian winter.
Not to mention the usability issues during daylight, which make a projector (there’s a lot of light in my living room) mostly useless 50% of the time.
The large black rectangle has its own use - it clearly defines where the screen is occupying space, something you’ve gotta configure for a projector.
Unless you are privileged af enough to have a room you can dedicate to a home theatre setup - (in which case, good for you, housing/rent prices here in Toronto make that out of reach for me even as a low triple digit salary individual, four times higher than most of my similarly aged friends) - a projector’s disadvantages far outweigh its benefits.
Even if visibility wasn’t a glaring issue, the heat is practically unbearable. Worst idea I’d had was to try it in my bedroom as a TV replacement. :P
I went through several models of projectors in an attempt to see if any of them would be tolerable, but no.
Large TV’s emit heat as it is; but it’s really like a small heater in a room on low, instead of an industrial scale heater.
That doesn't work when the manufacturer of the smart TV you got 3 years ago stopped providing updates 4 years ago. Then your nephew found a list of all the vulnerabilities it has.
For now, I've settled for buying a smart TV and not connecting it to the Internet. Until they come up with a way to phone home without my Wi-Fi, I think I should be safe from spying and vulnerabilities.
There's no presumption, this is known.
I've seen several reports that some TVs will find and connect to an open Wi-Fi if you don't set them up, and then phone home that way.
Probably better to connect it to your Wi-Fi, but block its access to WAN. Or open the TV and remove the module entirely.
I'd also argue that the built-in apps on "smart" TVs increase the manufacturer's profit margins rather than reduce costs for consumers. Sure there are some brands/models that compete on affordability, but others don't need to. After all, did the price of the iPhone go down as Apple scaled up and reduced the cost of manufacturing? No. They just started making more money off of it.
At the same time, folks all over the internet like to mention finding dumb TVs but never provide links, model names, or even just brand names. Why? I've looked pretty hard and can't find any actual verified dumb TVs that are actually for sale anywhere. The top comment in this post at the time of this writing directly mentions finding and buying a dumb TV, but provides no further details.
For the one smart tv I do have, I just don't have it connected to the internet. Even then, I have no way of knowing if it's connecting to any open wifi networks near me.
https://www.samsung.com/us/business/products/displays/ under "4K UHD Displays"
They do provide some solid advice, like checking settings and making sure it gets updated. It's probably good that they send out warnings like these; most people probably don't give any thought to IoT stuff at all. I don't know how big a target TVs are in reality, but it can't hurt to get people thinking about these things.
Remember when Samsung warned about having conversations around TV sets...
I'd imagine Verizon or AT&T wouldn't charge too much since they'd be sending logs, voice snippets, and maybe a few images. If the TV companies offered to share the data, they could probably get cell companies to do it for free.
I'm surprised you managed to find one. They seem quite rare these days outside of some budget brands that don't bother due to cost.
You can see a hardware comparison chart at https://developer.roku.com/docs/specs/hardware.md -- Liberty was the platform name of that first generation of Roku TVs.
I would just entirely disable WiFi on the thing, but guess what? Once you set the wireless up, there's no way to erase that data from the TV again. But allowing the TV to be on the network but not on the internet allows me to use some IoT features locally, so it's an alright compromise by me for now.
I would buy a dumb TV in a heartbeat, next time I upgrade. If there's one available.
>I would buy a dumb TV in a heartbeat, next time I upgrade. If there's one available.
There were none that I could find when I bought my TV some months ago. I hope that eventually some anonymous heroes start flashing open source OSes to their smart TVs and make us owners of our hardware again.
I guess if we're talking about possible features rather than actual ones: the ability to use the TV's tuner to stream to other devices would be another intranet feature that could be on someone's wishlist.
I agree, the equivalent of CyanogenMod/LineageOS for smart TVs would be great.
Just don't connect it to the network - use only HDMI. This will turn a "smart TV" into a "dumb TV".
Can you provide more context?
If you can provide an example of a receiver that does this that would help. I'd like to see actual usage of this feature before I start worrying about attack vectors. From what I can find nobody has implemented it.
Maybe someone could even put together a repository of diagrams to show which areas need to be covered.
Would be a fun to see for us paranoid people.
Plus, tin foil hats can actually improve signal reception around the 2.4GHz range used by WiFi :P
The only real annoyance (aside from installing a huge faraday cage throughout your home) would be the need for a way to relay cell communications via your hardwired ingress/egress paths. Or, you simply designate your home a cell-free zone and walk outside when you wish to receive your SMS payloads or make a phone call. WiFi calling is probably a good solution for a lot of people too.
I'm assuming the disaster is being bought by Google. What is needed is some sort of regulation that prevents the new owner from using the newly acquired data in a way that the user did not originally agree to. If the original agreement stated that the data would not be sold, then the new owner just can't go off and sell the data. This is what the regulation needs to protect, and the new company cannot retrospectively remove the option either.
Sounds paranoid, but so did everything else until now.
It's an "ok" TV. The picture is pretty decent, but not great. The sound is fine. The refresh rate is usually ok, but isn't suitable for fast paced games.
Of course computers can be broken into as well but at least those are the beasts I know how to maintain.
Was there actually ever a TV update that changed anything significant for the better in the mode I describe above? It's an honest question, since I never did it.
While not literally taking video and recordings of you, it's still problematic.
It's getting harder to escape the surveillance. Quality of life will increasingly diverge from the mainstream.
The reality is that as long as there is any profit to be made from selling them, corners will be cut, and people trying to save money will end up with smart appliances that are windows for unknown actors.
In this scenario I find it way more comforting for big corporations and governments to have access than for individuals who could target specific people. Abstinence seems like the only solution from an individual's POV.
Not even new devices are kept secure: high-end Sony TVs manufactured in 2019 are still on the Android security patch level of June 1, 2019. No new updates are available from Sony, despite multiple high-severity security issues being publicly documented and ready to exploit.
And more importantly, nobody is liable when security updates are not delivered and customers are exploited using public CVEs.
To rest easier, I actually banned its MAC from accessing my network. Even that is probably not sufficient.
All of my viewing is through my Apple TV, made by what is, in my opinion, the most privacy-centered company in Silicon Valley.
I'm curious why that is (possibly) not sufficient? Anyone?
You beat this by setting a static MAC and then doing port security (switchport security by MAC address) on the switch port the TV is connected to.
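On a managed switch that's a few lines of configuration. Here's a Cisco IOS-style sketch (the interface name and MAC address are placeholders) that locks the port to one known MAC and shuts the port down if anything else shows up:

```
interface GigabitEthernet0/4
 switchport mode access
 switchport port-security
 switchport port-security maximum 1
 switchport port-security mac-address 001a.2b3c.4d5e
 switchport port-security violation shutdown
```

Of course, this only controls which device can use that wired port; it does nothing to stop the TV from joining someone else's open WiFi on its own.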
For video games, you can still buy 4K monitors and play them on that.
Another option: connect it to a dedicated subnet which cannot route to the internet or to other devices.
Yes! Like almost all other connected devices we are using. (Computers, Phones, Routers, DSL Modems, WiFi Access Points, Smart speakers, Cars, Modern cameras, ...)
Plus, you can get bigger screensizes. Is there something I'm missing apart from the bulb hours?
video chat (many models had Skype a while back, as far as I remember that was shut down though)
Everyone knows that news articles can be selective in how they choose to report, and that readers should therefore rely on multiple sources. This is bad, but not a fatal error. If a news agency completely invents quotes, that destroys the fundamental trust that makes it possible to build on flawed sourcing.
> The Press Complaints Commission has upheld a complaint against the Daily Mail
> The [incorrect] article ...was live for 90 seconds
> [The incorrect article] included quotes attributed to the prosecutors apparently reacting to the guilty verdict, and the description of the reaction in the courtroom to the news, stating that Knox "sank into her chair sobbing uncontrollably while her family and friends hugged each other in tears"
> It further stated that the family of Meredith Kercher "remained expressionless, staring straight ahead, glancing over just once at the distraught Knox family".
> The newspaper apologised for the mistake. It said that it was standard practice in such high-profile cases for two alternative stories (plus supporting quotes) to be prepared in advance
> It had published an online apology and explanation to readers; published the correct verdict in print the following day; launched an immediate internal inquiry (and subsequently changed its practices regarding such 'set and hold' stories); and also disciplined the person responsible for the error.
> Although the PCC recognised that the newspaper had acted swiftly and proportionately to correct the breach of the editors' code - and acknowledged that the story had only been live for a short period of time - it nonetheless remained "particularly concerned" about other aspects of the report, most particularly the fictitious account of what had happened in the courtroom.
> The attempt to present contemporaneous reporting of events in such a manner was "clearly not acceptable".
"IN EVENT OF MOON DISASTER": https://www.archives.gov/files/presidential-libraries/events...
The issue is that people have accused them of completely fabricating quotes for decades, and this was conclusive proof of the practice, because the quotes in it could never have been said. It would be as if the "In Event of Moon Disaster" speech included a statement from the widow of one of the astronauts saying it was all worth it for science.
Yeah, I'd agree with that. I'd say the same about literally any news outlet on the planet, though.
If it's an interesting topic find a trustworthy source to post.