TVs should be incredibly dumb. They should be screens for displaying stuff. That's it. Nothing else.
No network connection of any kind, no apps, no software beyond that necessary to do basic setup of how that screen works: brightness, input selection, etc.
A TV is not a monitor. It probably has an antenna connector, and because TV signals are digital nowadays, it has to decode the radio signal and support a variety of compression algorithms. Most TVs also have all sorts of image processing algorithms: at least a scaler, but most have other, sometimes questionable features. Plus overlays, recording, picture-in-picture, or whatever the manufacturer thinks will improve sales... TVs are already "smart" to begin with. You can just think of internet connectivity as the antenna connector of modern times.
Thankfully, I don't know of any TV that requires internet access yet. AFAIK, they all can be used as dumb monitors. If you really want a dumb monitor, you can either buy a PC monitor or a commercial one, like those used in shops to display ads and stuff like that. Some are even as dumb as a monitor can be (like the one I am using right now), with only a single input, a single resolution, no OSD, and only an on-off switch and two buttons to control brightness (which is just a dimmer for the backlight).
But dumb monitors tend to be more expensive. First, they tend to be of higher quality, and second, "smart" features are often a profit center for the manufacturer. For example, if your TV supports Netflix, Netflix most likely paid for it, and the amount is most likely more than what it cost the manufacturer to implement that feature. That results in a lower purchase price for the end user, the idea being that the partner will make it up in subscriptions.
I own several smart TVs but use none of the smart functionality. They don’t even have internet.
TV makers are horrible software developers, for the same reasons that mobile manufacturers were horrible mobile OS developers until Android came along.
We have similar solutions for televisions in the form of Apple TV and Chromecast.
I honestly feel I'm wasting my money when buying a smart TV these days, but you narrow your choice so much if you only look for dumb TVs that you're almost forced into buying a smart one.
While it could be that your usage data is collected and worth that much to them; the far simpler reason is simply both how competitive the "normal" TV market is, and how much you benefit from economies of scale. Even if your usage were worthless, a smart TV would likely be considerably cheaper.
Also, don't overestimate the bill of materials for a smart TV. Even a "dumb" TV almost certainly has a silicon brain simply for controlling the settings UI and various other functions - that's cheaper than discrete logic, and remotes need something to talk to. So really, a smart TV simply means "a slightly fancier chip" - but still a chip that's several generations old by smartphone standards. It's not a significant extra investment for the producer.
You also need to weigh the harm or benefit you accrue by paying for services and being shown ads against the benefit of a lower upfront cost for the hardware.
My only problem with LG is that it doesn't seem to continuously poll/maintain a connection to SmartThings, so the Google Home-based control will never work and it can take 15-30 seconds before you can control the TV via the official app. Maybe this is because it's hard-wired instead of on wi-fi.
I honestly cannot believe that there is no option anywhere in the settings to turn that off. The best you can do is make the pointer smaller so it is slightly less intrusive when it randomly pops up. I’m tempted to just pull out the batteries, since I can turn on the TV with the Roku remote and I already have a separate remote for my sound system volume.
Yes, I HATE that. My other big annoyance is when I have a movie paused and set down a drink on the table next to the remote, and the remote wakes up due to the vibration and unpauses the movie.
Yeah, a physical button to turn it on/off would be amazing. Maybe I can sacrifice a remote and see if I can just add a switch to the sensor. Not my worst idea.
Both LG webOS and Sony Android TV come across as polished. If not for the fact that I hate advertisement and tracking with a vengeance, I'd gladly use them.
I'm surprised to say but the latest Samsung interface and remote is quite good. Good enough I feel no need to put an Apple TV on it (which happens to have an awful remote).
Commercial signage TVs tend to be much more expensive and never really go on sale, though. I can't even find out the price of this product without calling them to talk to a salesperson.
> They’re more expensive because they need to make up the lost revenue from advertising
If that is the case, shouldn't the product be labeled to show that part of the price is subsidized by advertising, so that in a free market customers can make informed decisions about what products they buy and under what conditions? Otherwise there is a huge risk that manufacturers who add such hidden drawbacks can unfairly outcompete those who do not, and we get a lemon market.
Exactly, the problem is that market research clearly shows that most Americans aren't ad-conscious enough to pay $50-$100 extra to not have an ad be the first thing they see when their TV turns on.
> Of course, there’s also the simple thing of “enterprise stuff costs more for no reason.”
That "no reason" is usually a combination of the following:
1) Support avenues - commercial customers want rapid support in case something breaks, including overnight / on-site repair. That infrastructure costs more money compared to consumer appliances where the customers have to ship stuff to a central repair place.
2) Quality. Consumers are used to stuff failing after 3 years and getting the next hot new thing; commercial customers want a decade or more of life span with as few maintenance calls as possible - and especially, they don't want to redesign enclosures when a model is no longer available, so they demand that models keep shipping for longer - again, on the order of 10 years or more. Also, these displays generally have to work in a wide variety of environments - direct sunlight, heat, cold, vibration, or other movement. Higher-quality components cost a lot more money. Add more money for certifications required for medical or military deployments.
3) Spare parts. Again, the longer availability terms mean more costs for the support infrastructure - while for ordinary TVs the parts stock can be emptied out after 3-5 years, stock has to be kept around for way longer for commercial TVs, and that includes buying up spare parts when a supplier EOLs a part.
4) Features. Commercial TVs tend to have more selection of (rare) inputs, e.g. BNC or SDI (the latter to drive an array of screens around a spread-out location from a single signal source, you can't do that with HDMI).
5) Firmware. After three to four years no manufacturer except Apple gives a flying f..k about the firmware, which means security holes go unpatched. Commercial customers demand longer update cycles (and better validated ones), again that costs more money.
6) Vandalism and elements protection. This one is huge and ties into the quality part. While your home TV won't need to be protected much against anything, vandals will attack anything without mercy - with anything from graffiti to hammer blows to hydrofluoric acid. Add to that nature: bird crap, vomit, tree sap, pollen, drunkards stumbling into your digital signage... or humidity and harsh rain.
7) Loss of revenue from advertising, as you mentioned.
Or because they don’t want to give you a reasonable choice but to send all your viewing data to them. Comparing prices of dumb TVs a few years ago to these “enterprise” ones, it seems unlikely that the premium is to make up for lost ad revenue.
Another possible factor is that enterprises are willing (or even required) to spend more money to keep their data private and secure.
Yeah, I think it's primarily market segmentation. Or rather: you can charge these people more money because they have it.
They probably segment based on durability guarantees. These TVs may be slightly more durable and last longer at extreme temperatures, because consumer TVs can cut corners to make the sale, while commercial buyers want something that lasts a long time, meaning fewer repeat sales.
Yeah, and if you're lucky enough to find a seller that way, 9 times out of 10 it will be a wholesaler who will only sell you stuff if you have a corporate account, meaning you either own a business yourself or are cleared by your boss to purchase stuff.
Add to that the ludicrous shipping costs for anything weighing more than a box of cereal, insane import duties for anything worth more than say $50, plus a fat VAT slapped on top of all that, and most people in the rest of the world* will probably think twice about it.
TL;DR: buying enterprise-grade gear of any substance from across the globe as a consumer is a little more involved than ordering a pair of slippers from some seller on AliExpress.
I was able to do this with my TCL/Roku TV a few months back. My curiosity got the best of me with the announcement of their latest OS, so I upgraded to see what the new features were like. Now I am stuck in the ecosystem, as the downgrade feature is completely disabled (even from a USB stick, with no internet connection, after a factory reset). Luckily, I use a Pihole as my DNS on my router, but still. Super annoying.
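For anyone curious how the Pi-hole approach mentioned above works under the hood: it answers DNS queries for blocked domains with an unroutable address, so the TV's telemetry requests go nowhere. A minimal sketch of the same idea in plain dnsmasq (the resolver Pi-hole builds on), with made-up domain names standing in for real telemetry hosts:

```shell
# /etc/dnsmasq.d/tv-sinkhole.conf
# Answer queries for these (hypothetical) telemetry domains with
# 0.0.0.0, so any client using this resolver gets a dead-end address.
address=/telemetry.example-tv-vendor.com/0.0.0.0
address=/ads.example-tv-vendor.com/0.0.0.0
```

The caveat: if the TV falls back to a hard-coded DNS server when its queries fail, DNS-level blocking alone won't catch it; a firewall rule on the router is the more robust complement.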
That's a valid point: dumb TVs need to be kept dumb by not updating/upgrading the OS, although this results in the conundrum of leaving existing vulnerabilities unpatched in favour of not introducing new ones. Vulnerabilities in dumb TVs could still be exploited by plugging in USB devices, SD cards, or even an STB like a Roku.
SNMP is a monitoring protocol, and WoL is a great way to automatically control your screens. You can't use WoL outside of the local network, and SNMP is configurable and secure (if it's at least v3). It's not something you should be worried about.
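For context, WoL is about as simple as a control protocol gets, which is part of why it's hard to abuse from outside the LAN: it's a single UDP broadcast that routers don't forward by default. A minimal sketch in Python (the MAC address in the usage example is a placeholder):

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by
    the target's MAC address repeated 16 times (102 bytes total)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet on the local subnet; the NIC of a
    sleeping device watches for its own MAC and powers the device on."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))
```

Calling `send_wol("AA:BB:CC:DD:EE:FF")` from any machine on the same subnet is all it takes, which also illustrates the parent's point: the packet is a broadcast, so it dies at the router.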
I've asked this in other similar posts: how do you even buy one of those?
I went to the equivalent LG website for my country, and they don't even have a button for contacting them like in the link above.
So I did some searching for the model "LG UT640S" and I only found it sold on one website in my country, but listed as a smart TV (full number: "LG 43UT640S0ZA", parent ends in S0UA).
What gives? Do you need a company to be able to buy one of those? Maybe a sort of line of credit with LG? Buy in bulk? None of this makes sense honestly. I wanna buy a product that clearly exists in the world, but I can't even find someone, somewhere, that will sell it to me.
That would essentially force people to buy two devices - one for receiving content to display and a second to display it. All your solution does is shift the problem on to the receiving device. It wouldn't fix anything, and at the same time it'd increase the cost and complexity for the user (by a very small amount admittedly).
Yes, and there are a lot of good reasons for that! The two functions become obsolete at different rates, and people have different needs for each of them, so it's the right place to put an abstraction layer.
When I buy a computer monitor, I can pair it with any computer I want. I can upgrade the graphics card or processor independently of the monitor. I'm not locked in to a particular computer based on the features of the monitor I want. TVs should be the same. It could be as simple as a USB stick you plug in the back.
>The two functions become obsolete at different rates
The implication here is that the connectivity tech goes obsolete quicker than screen tech. That has traditionally been true, but has it been over the last 5 years or so? We don't even have to get into the specific display technology. Just from a feature perspective, there has been a lot of innovation in screens including 4k, HDR, and high refresh rates.
And the dominant broadcast standards are still 1080i 60Hz with no extra dynamic range. A large amount of new content is still targeting that. Aside from over-the-air, I suspect that most cable/satellite operators are offering that service level, at least without premium fees.
Since the end of the digital-TV transition, the TV industry has been throwing a lot of stuff at the wall trying to find something that sticks. There's no clear "it becomes a paperweight" factor to make us all go out and replace newish sets right now. Remember the 3D TV trend? Or the year when everyone brought out curved sets, and then went back to flat? Smart TVs are another variation on that theme, with the added benefit for manufacturers that their lowest-bid tech and changing third-party service requirements will leave you with a set where half the hard-coded service buttons don't work and the other half are unusably slow, depressing you into buying a new set in three years.
I expect the next real ecosystem change will be when ATSC 3.0 becomes a workable thing. Then you'll actually be able to offer 4K/HDR with an array of content without the caveat of "external game console/PC/streaming service required." I'd be a bit hesitant to get a new set until then, just because of the risk that you'd end up with something not fully compliant (I'm thinking of those first-gen 4K LCDs that wouldn't accept a 60Hz input).
Although the topic was about smart features so it isn't really OTA or cable we care about. Services like Netflix and Disney+ have embraced those new technologies. I don't think technology like 3D was ever embraced to this extent.
If you compare setups between a new TV with a few year old Roku, AppleTV, or whatever versus a few year old TV with a brand new streaming device, the setup with the new TV is likely going to be the superior option. In fact, Apple hasn't even released a new AppleTV since 2017. I have no idea if this will be a trend that continues, I just don't think "the streaming hardware becomes outdated quicker than the display hardware" is guaranteed to be true like we have assumed it was in the past.
It's probably about the shape of the curves as well as, if not more than, the rate of advancement.
TV display technology is likely to proceed in plateaus because it ties to agreed-upon standards. By the mid 1990s we had the ability to make a 1600x1200 CRT monitor with an 85Hz refresh rate, but even a top-of-the-line TV wouldn't offer much more resolution or a higher refresh rate than a 1965 model - that's all you could get out of NTSC broadcasts. (Yeah, there were some progressive-scan input formats, but that's still only a token advance.)
Introducing external sources (streaming, consoles, etc.) provides a bit of wiggle room to advance the resolution/colour/refresh rate bars, but that's still not going to change the installed base nearly as fast as if they said "here's a new standard format and every local broadcaster starts 8k broadcasts tomorrow."
In contrast, streaming products evolve in a continuous curve. Since there's very minimal, if any, platform standards, they can say "here's 24k resolution", or equally likely "here's a new codec/DRM format/API that old boxes don't support."
New monitor comes out with higher resolutions and hdr? Don't care, old one still works.
New streaming service comes out and there isn't an app for it on my device? Or security updates stop being pushed and now my device is part of a botnet? Now I care.
Going “obsolete” isn’t uniform. A product can go obsolete for your use case without going obsolete for mine. An audiophile might consider a platform that doesn’t support the latest surround sound protocol as “obsolete” while I would not even notice.
Didn't the user already exercise their control by choosing to buy a smart TV? And I don't think many people would say there is a lack of competition between TV manufacturers.
As is constantly brought up in these threads, there is often no dumb TV option available with higher end panels, and all of the TV vendors are incentivized to be equally evil.
I just don't think it's an issue for the majority of consumers. If you went up to random people on the street and ask whether they would rather have a smart TV or non-smart TV, I think mostly you would get dumbfounded looks of "of course I want the one with more features built-in".
Because that's a manipulative question. The true question is "Would you rather have a TV that spies on you, runs slow, stops working after a few years, and sends pictures of everything you watch to some other country, or a TV that does none of those things and lasts 10+ years?"
How is it manipulative? "Smart TV or non-smart TV" is the most vanilla, agnostic way I can think of to ask the question. Once you start adding pros and cons you muddy the whole thing, and all the pros and cons you have added conveniently lead to your preferred answer.
Don't you think it's more manipulative to build a one-sided argument into the question?
Yes, but the argument goes that someone selling a device specifically for receiving content may see you as a customer of that service primarily, rather than as a means to subsidize the cost of the display or device.
Not sure how that plays out in practice, but not a distinction without a difference.
OK, here's how you make a smart TV, essentially. Take a dumb TV with some ports attached to it, plug a Chromecast into one of the ports, and then pour epoxy over the whole thing.
Your argument is that this adds value to the Chromecast + dumb TV pair, despite being a fundamentally destructive operation.
Furthermore, an external box is easy to replace when it breaks or becomes obsolete. What are you going to do when the embedded OS in your smart TV is no longer supported?
No thanks. My TV is a monitor, only. It has never been, nor will ever be, connected to my home network after I first brought it home and updated its firmware to whatever was current at the time.
I bought a Sony Blu-ray player for the bedroom; it had Netflix etc. on it, and seemed like a good compromise: one box instead of a separate disc player plus an Apple TV. About six months ago, the Netflix app stopped working. No updates available, just useless.
Speaking of Netflix: some users were left stuck in 480p land when the DRM module was (likely) compromised and used to rip high-quality streams, so Google downgraded those devices' certification to Level 3 (software-based decryption).
I have this issue today. I've got a five-year-old Samsung TV that was top of the line at the time, and it's still more than good enough as a panel, but the built-in OS has been updated a few times and now it's so slow it's painful to simply change the channel by pressing the numbers or turn up the volume.
This article summarizes an interview The Verge's Nilay Patel did with Vizio's Bill Baxter [1]. Their televisions are sold at or close to cost, and their business model is tied to tracking and to selling offerings on their smart TV platform. Roku has a similar model. I believe that in this case, the total monetization value of the smart TV services has to be higher than that.
You could get a smart TV and just not connect it to the internet. If you were really paranoid, you could disconnect/snip the wifi antenna.
At least my TV still allow firmware updates via USB, so you may not lose that. Not that it'd matter much if you weren't using any smart features, but they do still provide things like improving compatibility with devices (recently HDMI 2.1).
I have a bigger issue with VR headsets. They go a step farther in that you have to use their platform. There's no equivalent of "just use HDMI" in some cases.
No need to disconnect anything - all these TVs have manually configurable IP addresses. Just set the address and the gateway to 1.1.1.1 or something and it'll never be able to talk to another device.
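If the TV ignores static-IP settings or resets them after an update, the same thing can be enforced on a Linux-based router instead. A sketch, assuming you know the TV's MAC address (the one below is a placeholder):

```shell
# Drop everything the router would forward from the TV's MAC to the
# internet. Traffic between the TV and other LAN hosts (casting,
# local control apps) doesn't traverse the FORWARD chain, so it
# keeps working.
iptables -I FORWARD -m mac --mac-source AA:BB:CC:DD:EE:FF -j DROP
```

This keeps local features like casting usable while making the internet unreachable, which the gateway trick also achieves but less tamper-proofly.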
Screens can and should last well over 10 years. No matter what hardware or software is put in there, it's going to stop being maintained after a period of time much shorter than that. At that point the convenience of built in smarts, if there even is any, becomes a major inconvenience and you either have to work around it or make a giant piece of e-waste.
Counterpoint: smart TVs are great. My LG has replaced almost every other thing that used to be in my media room. It is my streaming music box (I can target it for casting from my favorite iOS app, and I can use the interface on it for browsing my home network storage if I want). It has Netflix, Amazon, YouTube, Hulu, Disney, Vudu, Google TV, and every other damn thing on it. It has voice search, so I just say the name of the show I want to watch and it finds and displays which apps have that in their catalog. It also has a web browser so I can just watch anything I want. I can start a video on YouTube on my phone and later transfer it to the TV (I can do this from either device). I can cast any tab from Chrome to my TV.
I say these TVs are just the right amount of smart. Maybe there are worse ones (I imagine, having once owned a Sony Playstation, that the software on a Sony TV is atrocious) but LG WebOS is brilliant.
Seconded on the LG WebOS. I hated smart/dumb TVs (Toshiba, Samsung, Sony) until I stumbled onto LG's WebOS. Speedy, works as expected, has most of the streaming apps. I thought the Magic Remote mouse thingy was going to be a gimmick, but it's surprisingly nice to use.
Oh, and the checkboxes to disable some of the spying are buried with an industry-standard level of assholery, so that's good.
Exactly, and it will be just as easy to use. Why bother with the "convenience" in the first place? Why was all the time wasted designing and producing the built-in stuff in the first place? Why make more stuff to maintain for no gain? When the built-in stuff breaks it will either get in the way or stop the thing from working all together. Why? Just why? There has been no sane answer given to that simple question since this smart tv thing started.
You need to have a moderately big processing complex to do various kinds of tasks in the TV anyways (scaling, HDMI negotiation, temporal interpolation, audio, etc).
Decoding video is not any significant additional BOM cost (mostly just the network interface), and it is more convenient and nice for a significant fraction of users.... plus it allows the manufacturer access to additional revenue streams (e.g. getting some pennies for bundling Netflix).
My Samsung TV was connected to the internet, which let it auto-update, and now the basic UI like volume changing is incredibly slow. It's super painful to just switch between HDMI devices.
No, this is a stupid point. You should always prefer a factored product, no pun intended. All of these "problems" are easily solved by buying a dumb TV and hooking a computer up to it. That way, if you ever need a TV without a computer hooked up to it, you can just separate them.
You've got the whole process the wrong way around. First you decide on the software that you want, and then you choose the hardware to run it. I want LG WebOS and there's only one platform on which it runs. Plugging some PC into the same panel without the software doesn't get me there. In fact, I'm sure it would be an endless mess suitable only for people whose time has no value.
Disagree. My 75-year-old parents still have a dumb TV. They have different remotes for cable, sound, the Blu-ray player (which has some smart apps), the Fire TV, and the Apple TV.
Their life is not easier because of this; they don't watch what they want when they want, because it takes minutes to switch to a different source.
A TV with a good panel and a separate smart stick? :D
Really, that's the best choice. Always has been.
My neighbour called me because his Google apps (particularly, Youtube) stopped working on his smart TV. It's a cheap HiSense or something.
It's still under warranty, but the store won't do anything, since the hardware is fine, and they say the manufacturer is responsible for software updates. Which have stopped coming.
I'm pretty sure he didn't quite understand my "apps are Google property and they need to be updated every few months, but this company stopped doing that" explanation, but anyway, he now effectively has a dumb TV in the kitchen, one he specifically bought so his wife could find recipes online, log in to Facebook, and watch YouTube.
I told him to just get an Android computer stick, which will work just as well and last way longer. At the very least, it can be manually updated, unlike the built-in software.
The right way to fix this is one single smart device, which is the only device connected to the dumb TV. The best of these types don't have a remote. They're controlled by a phone.
If the intelligence is built into the TV then it cannot be updated or replaced -- so either there will again be multiple devices, or the TV will need to be replaced on a frequent basis.
A smart TV makes the problem you describe even worse.
Completely disagree on the lack of remote. Hitting a button on a remote is better UI than needing to unlock a phone and hope that it's actually connected to the streaming device (not always the case).
I think the new Chromecast is an admission from Google that the average person still wants a remote.
And in this context: I'm in my 30s and I don't always know where my phone is. My parents don't always have their phones accessible or know where they are without looking. Also, a phone is less tactile in the dark.
Perhaps I should say that the best of this type offer the option of using a phone.
I personally prefer a phone but I also understand the preference of a remote. With a separate smart device these types of preferences can be accommodated over time, unlike with a one size fits all smart TV.
My smart TV's built in apps (Netflix, mainly) are incredibly unstable. About 5 years back I started to use a Firestick instead. The TV is best as just a TV.
My single Apple TV remote with only 7 buttons and a trackpad controls everything. The AppleTV itself wirelessly, the TV power on/off and source via HDMI-CEC, and finally the external amplifier volume via IR, which powers on/off itself when sound is received or idle.
The TV automatically switches source to the game console when it's turned on and vice versa.
Couldn't be easier.
I haven't done any fancy hacking or special hw to get this working, just basic Samsung, Apple, a decent amp and a few minutes adjusting the settings of each.
So disable/don't use it? It doesn't even actively listen for "ok google", it only listens if you hit the assistant key.
If you ever have an android phone in your home, it's exactly the same thing (except the phone actively listens by default).
Edit: Also, for fucks sake, context. The person I'm replying to is talking about getting his parents a SmartTV (way worse privacy-wise than Google), for the convenience factor.
And then they forget to update it, or Nvidia discontinues it, and someone hacks into it and your room mic gets posted on one of those open-IP-camera lists.
This is not some tin-foil-hat idea. You can go on Reddit and browse people's personal spaces being broadcast for everyone to see.
I don't mean that it will become a camera, but that it will be listed alongside them. "Listen to this random person's living room / bedroom" makes for some entertainment for others.
It takes 2 remotes to get the sound and HDMI to the right input. They are also confused about which apps it works with, or how to get to them (on the phone), and even though they have a Google Home, it only works with a few providers for voice control.
I think I read somewhere that it will soon be cost-effective to build cellular chips into "smart" devices like this, to circumvent people who don't connect them to the internet.
This is one of the main use cases of 5G, IIRC: enabling millions more devices to connect to the network. What an absolute nightmare we are about to enter. "Just don't connect it to the network" is no longer a valid answer.
I will start building 5G jammers. The FCC can eat a bag of dicks.
Apparently even 5G-NR has an equivalent of the wifi "unauthenticated deauth" intended for emergency quench of uncooperative devices.
The cellphone location data abuses revealed over the last three years have led to a remarkable increase in the number of GPS jammers out there. Gaussian-noise blurred, the good ones that can't be notch-filtered.
I think the real answer is that the EU will come up with some "Right to disconnect" which mandates that all devices request user consent to connect and retain as much functionality as physically possible while in offline mode.
The technical battle is basically lost on IoT once they can embed modems.
What annoys me most about "smart" TVs is that they are complete crap. I tried to use a friend's TV that cost several thousand. It had an "air mouse" remote, but the thing was laggier than the first time I tried a full GNOME desktop on my Pentium 2 back in the day. Barely usable. I have a Raspberry Pi 2 that's been running for years, hooked up to a projector, that wipes the floor with that piece of junk. But he's now stuck with that interface for years.
There's something nice about having a single device you can chromecast to, play Netflix on, etc. I mostly just wish there were open-source firmwares for TVs.
Really, it comes down to exposure surface. Just looking around my home, I sit with hundreds if not thousands of devices from various companies and their internet implementations. I have no idea what's going on 'under the hood' in all of these and have to rely on trust. If I were smarter, I would have bought from a smaller group of trusted names like Apple and Microsoft and let them deal with my updates.
Why do you say it’s not difficult? You know people want TVs to have convenient apps and integrations on them. Convincing people to get that functionality from an additional device that they have to set up sounds very difficult.
But how would they track everything you watch on it then?
Can't have you throwing in a Heat (1995) DVD without it being recorded in a database, can we? Sure, your smartphone probably picks up the audio, but it may just report that you're listening to Moby.
You can purchase a monitor. This follows the philosophy of buying separate dumb devices that only do one thing. A dumb TV is just a monitor + speakers + a TV tuner, after all.
There's no reason any phone shouldn't be able to control any TV, except that IR ports are no longer popular on handheld devices. Back in the 90s there were wristwatches that could control any TV.
Every few years, some auditor would always flip out about IR data transfer. It would always get escalated because none of the frontline IT people had heard of it.
I’d always chuckle thinking of the boogeyman bad actor employee who decided to exfiltrate customer data to a circa 1999 palm pilot at 9600 baud via IR instead of the dozens of easier methods available.
It does, but only just. There are still a few models around, but the category of large-format dumb displays has virtually disappeared over the past decade. Commercial displays have almost entirely turned into smart TVs that run business apps instead of consumer apps.
I use streaming apps on the TV all the time. I don't see any real justification for this luddite mindset. If you don't like data collection that makes the TV cheaper that's a separate issue.