Hacker News
Ask HN: Are there any 4K “dumb” televisions?
436 points by luke2m 56 days ago | 497 comments
With news like [1][2], and problems I’ve had in the past, I would like a TV with a modern resolution, but just inputs and a tuner, no “smart” features. Does anything like this exist?

[1] https://hackaday.com/2021/11/29/samsung-bricks-smart-tvs/

[2] https://www.theverge.com/2021/11/10/22773073/vizio-acr-advertising-inscape-data-privacy-q3-2021

4K monitors. You need a soundbar or some other audio setup plus a remote, though. There's also no built-in tuner, but I assume you'd get a set-top box from your service provider or use a streaming device (Apple TV, Fire Stick, etc.)

4K OLED https://pcpartpicker.com/products/monitor/#r=384002160&P=7

4K IPS https://pcpartpicker.com/products/monitor/#r=384002160&P=2

4K VA https://pcpartpicker.com/products/monitor/#r=384002160&P=4

4K 55" or bigger monitors (there aren't many choices) https://pcpartpicker.com/products/monitor/#P=2,7,4&r=3840021...

The 4K 55" OLED Alienware has speakers, but I doubt they're any good https://www.dell.com/en-us/shop/new-alienware-55-oled-gaming... (it actually comes with a remote too)

Linus made a video about it https://www.youtube.com/watch?v=L3oqktdx2a8

Last but not least, you can go even higher resolution than 4K, but these are all IPS and none are bigger than 34" https://pcpartpicker.com/products/monitor/#r=768004320,57600...

Another approach is to look not for things advertised as "monitors" but instead look for "digital signage"[1]. Nowadays most of these contain some networking features but they'll be oriented at local control (i.e. by you via something on your LAN), not some third-party control center accessed via the internet.

Anecdotally this is the approach I took ~20 years ago when buying a (then slightly exotic) plasma flatscreen from Panasonic. It is still working flawlessly today, though I keep hoping it will die so I can guiltlessly replace it with something newer/bigger/higher-resolution.

[1] A random example https://www.usa.philips.com/p-p/86BDL3050Q_00/signage-soluti...

Digital signage is good but from my understanding (and I could be wrong) probably over-engineered for home usage. They're intended to be powered on 24x7, and last a long time. Probably more resistant to burn-in too. All of which is good, but if your use case isn't so intensive you could get by with something lower-end. (Especially if you want to eventually replace the device and are looking for an excuse ;) )

The model that I have is definitely over-engineered. I have a 2K sign display that uses DisplayPort, and it's 8 years old.

It's a bit ugly, but fine as long as you're okay with that.

My direct experience is somewhat out of date, but from an engineering perspective the unit I have is much simpler than any TV, since it lacks a tuner or any fancy video scaling capability, and has no audio capability of any kind (that was a feature to me, since I use an A/V Receiver for sound). I think in general screens meant for signage are probably brighter than most TVs/monitors, but depending on the room that could be a useful feature as well.

But does digital signage have good panels like "real" TVs? Eg. the high end oleds with deep blacks and stuff?

My experience is that not only are the panels generally worse, support for things like color calibration or Dolby Vision is almost impossible to find. Which is understandable because it's not obvious what advantage HDR would bring to digital signage.

While not HDR, I'd think colour calibration would be fairly important for signage.

If your company red is too washed out, or one section of a video-wall menu board is inconsistent brightness, it looks bad on your brand.

It depends. I have an Iiyama 44” and the display, while nominally 4K, is noticeably not as good as a good 32” 4K monitor. It’s not really visible when watching video, but using it as an external display looks horrible close up.

This is one of the ones with an Android board in it, and if I did it again, I'd get something sold as a computer monitor.

Of course a 44" panel doesn't look as good as a 32" one; it has lower pixel density. This has nothing to do with it having a bad panel.

The pixel density difference is about 30%. With careful close observation, you can easily see the pixels in each one. The monitor has smooth solid colors. The Iiyama looks more like it's natively 1920px wide with some 'interesting' pixel layouts for more dynamic range. It looks like complete crap with things like window titles, the macOS blue highlight, and other gradients.
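For what it's worth, the rough arithmetic behind that ~30% figure (assuming both are 3840x2160 panels):

```python
import math

# Sanity check on the density gap between a 32" and a 44" 4K panel:
# for a fixed resolution, PPI scales inversely with the diagonal.
def ppi(diagonal_inches: float, w: int = 3840, h: int = 2160) -> float:
    """Pixels per inch for a panel with the given diagonal and resolution."""
    return math.hypot(w, h) / diagonal_inches

print(round(ppi(32)))  # 138
print(round(ppi(44)))  # 100
```

100/138 works out to the 44" panel being about 27% less dense, close to the "about 30%" quoted.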

This is good advice, but if you're looking for something fancy like HDR or DV you will be disappointed, as (in my experience) digital signage displays often lack those features.

I have a big 4K TV that was destined for a sports bar but was slightly damaged; got it for a great price. Sadly it doesn't have HDR either, but it is an older model. Anyway, it's great in that there's no WiFi. There is a network port, but there are no streaming apps or anything installed on it. I use it paired with an Nvidia Shield and a Nakamichi soundbar and have been enjoying the experience.

But I digress: look for business displays.

Same thing I did 20 years ago. IIRC it was the only way to get a flatscreen at the time. Mine was a NEC. 40" plasma. 1366x768. Not even full HD. I think it would work out to about $7000. It pissed me off that it didn't even come with a wall bracket and I had to shell out another $600 for that. For the price of just my wall bracket you can now buy a pretty nice 65" 4K screen I'm sure.

Challenges are features like HDR and having four HDMI inputs.

Planar makes incredible and also expensive displays for the commercial market. They offer a "luxury living" solution but the smallest is 100".


Lots of places call these "commercial monitors".


With 4k monitors you usually pay a larger premium for latency, refresh rate, gsync/freesync, etc. All of which gamers care a lot about but are irrelevant for TVs.

Not really. Just like TVs, there's certainly a 'premium' range for those who are interested, but there's a wide range of regular 'work' monitors. E.g. a few years ago I got a 43" 4K monitor with 10-bit colour depth, 60Hz refresh, good local dimming etc. for $800.

In OP's link for IPS, a monitor of the same size and brand that seems to be the next version after the one I got is $550. It's hardly a premium over a comparable TV.

I got such a monitor earlier this year for around $600. It's amazing for work but could totally work as a television with an appropriate device connected, especially since it has decent, loud audio built in.

Console gamers care about TV latency too.

Why does (consumer) monitor tech always seem to lag TV tech by a few years?

It looks like the situation is still that in the 4k OLED space there are a few ~$4000+ monitors and dozens of ~$1000 TVs. Per the pcpartpicker link, maybe the Gigabyte FO48U will change that, but it's still out of stock. Besides, I feel like this has happened before with HDR and 4k and IPS. First it shows up in TVs, a year later it is cheap in TVs, a year later it is expensive in monitors, and finally it becomes cheap in monitors. But it takes years. Which seems odd, since surely they use the same panels? Is it an industry structure thing, where panel manufacturers integrate and co-develop with TV manufacturers but monitor manufacturers are separate, only get the panels after release, and need a year or three to turn things around?

>Why does (consumer) monitor tech always seem to lag TV tech by a few years?....

Monitors used to have "much" lower input latency, higher PPI, much higher refresh rates, and generally higher reliability, because they are expected to be constantly on, i.e. their panels have different specifications.

Although I am not sure if most of the above is true anymore, especially with OLED, given how TV manufacturers have also focused on gaming. But reliability is still a thing for monitors. It's similar to a reference TV, which uses a panel from one or two years prior.

Edit: I had to look up Panasonic's TV sets and panels, and then I discovered they are pulling out of TV production and outsourcing to an external partner. Sigh.


Yeah, I use gaming mode for my TV-as-a-monitor and if I don't the lag is noticeable even on the desktop. It has the nice side effect of disabling the obnoxious sharpening filters, too.

My dedicated monitors have had dismal reliability: one died right after the warranty, one died inside the warranty and they flaked on the warranty anyway. My reliability expectations are rock bottom, my TV will have to work hard to undershoot them.

> Why does (consumer) monitor tech always seem to lag TV tech by a few years?

Well, “always” seems like an exaggeration; consumer monitors were far beyond 480i before consumer TVs were.

> It looks like the situation is still that in the 4k OLED space there are a few ~$4000+ monitors and dozens of ~$1000 TVs.

That’s not monitors being behind in tech, that’s TVs being cheaper because of economies of scale and opportunity for ad serving and data harvesting.

> Is it an industry structure thing, where panel manufacturers integrate and co-develop with TV manufacturers but monitor manufacturers are separate

AFAIK, LG, Sharp, Samsung, and Sony are all four panel/TV/monitor manufacturers; I don't think that's an issue.

Scale and ads are plausible explanations for why monitors are behind in tech, but they're still behind in tech.

I don't quite see how "TVs are cheaper because they earn money beyond the sale and have more economies in scale" translates to "monitors are behind in tech"? They have similar tech, but at different price points.

Everyone shops at a price point, and the existence of a $6000 professional monitor just isn't at all relevant for most people. Consumer monitors are years behind consumer TVs in tech.

EDIT: Actually, I did specify that I was talking about consumer TVs. You didn't read my post, and then you decided to nitpick anyway. Bravo.

What a weird nonsensical statement.

Monitors and TVs are manufactured with the same "tech", just to different specifications to fit their desired purpose/niche, and to capture the maximum possible value from that market.

You could maybe make an argument that Samsung panel tech is behind LG's or something, since companies have separate R&D labs and actually have different technology, but in order to do so you'd have to be an industry expert.

In what world is a comparison "nonsensical"? They both display pixels. Each can be substituted for the other with a modest amount of non-panel-related effort. They compete. We can compare them.

> the same "tech", just to different specifications to fit their desired purpose/niche

Clearly not. I am using a TV as a monitor right now, because 4k + OLED + HDR + 120hz was just not available for $1100 in the monitor space six months ago (I think there was a $6000 offering, lol). Looks like it still isn't. This situation has been going on for years. Before OLED it was HDR, before HDR it was 4k, and so on. TVs are always far ahead, monitors are always far behind.

I'd rather not use a TV as a monitor because it's a PITA. I have to put up with substantial non-panel-related silliness to make this happen (turn the TV off/on with a remote, deactivate the laggy filters, tolerate the "smart" BS, etc). If monitors are so well tailored to their own niche, why are they losing so badly to a competitor who isn't even trying?

> to capture the maximum possible value from that market

That's the only explanation I can come up with: monitors are a backwater that the industry just doesn't care much about because volume is lower. Tech has to trickle down, and that takes years.

> TVs are always far ahead, monitors are always far behind.

Your own description isn't of TVs being ahead in tech, but offering the same tech at a lower price point. (There often is some actual tech lag, for many of the same reasons, but it's much shorter.)

> I'd rather not use a TV as a monitor because it's a PITA. I have to put up with substantial non-panel-related silliness to make this happen (turn the TV off/on with a remote, deactivate the laggy filters, tolerate the "smart" BS, etc).

Usually, all of those except for the filters are effectively bypassed when using an input that supports CEC.

> Your own description isn't of TVs being ahead in tech, but offering the same tech at a lower price point

I specified consumer TVs. You didn't read what I wrote, and then you decided to nitpick anyway.

> Usually, all of those except for the filters are effectively bypassed when using an input that supports CEC.

Yeah, I heard about that, but evidently it needs more work before it Just Works.

that looks like a price problem, not a tech problem. you said it yourself, the tech exists, just much pricier.

And like the other person said, one of the reasons is just basic scale. TV is a much bigger market than monitors will ever be.

TVs are such a big business that they overwhelm the rest of the display manufacturing world. That's why 16:10 monitors basically disappeared - 16:9 at 1080p is a TV.

On the other end of the spectrum are professional industry displays, which are ahead of consumer-facing devices; they're shown at NAB (vs. CES), and there you'll find 8K monitors for tens of thousands of dollars.

Maybe in the high-end only?

Speaking of low- to mid-end TVs, the ones I saw on display in local shops were just overpriced junk.

Even though it's smaller, I installed my ~7-year-old 24" BenQ FHD e-IPS monitor as a TV for my parents. $120, plus $20 for the cheapest 2.1 sound (I think 2x10W + sub); cranked the bass much higher than advised, put the speakers behind the monitor and the sub on the floor, plus the ISP's TV box with remote. Speakers and monitor are always on; they have their own power-saving stuff. My parents are ecstatic, and guests are asking where they got the TV from... apparently it looks better than the ones you could buy for $500+.

Last time I checked, I remember finding that most TVs don't actually operate at the advertised resolution; they have all kinds of "prettifying" algos. Not going to trust them ever.

The panels have the advertised resolution, but yes, for "smart" TVs you always have to figure out how to turn off the gross sharpening/compression filters that they use to win the Great Best Buy Screensaver Battle. It can be done, though, and certainly if the manufacturer wanted to omit them in a monitor offering it could.

Two of the most important settings:

Game mode. This turns off most/all of the image processing that greatly increases lag.

Overscan. Turn it off too; overscan zooms the picture in a little to crop out artifacts around the edge, costing you pixels.

^ This.

Monitors are built for being an arm's length away. TVs are built for being several yards away. The pixel density changes accordingly

There are monitors built specifically for digital signage, these have the same specs as large TVs but no tuner or adware.

And so does the ability to use IR or some other remote control mechanism.

A “dumb tv” would just be a monitor with a remote to control power and volume.

TV manufacturers can offset the lower price by selling ads to show you on your “smart” tv

That wouldn't explain the delay, and do they really expect to sell $3000 of ads per customer? I have doubts.

> That wouldn’t explain the delay

That and expected maximum market size (or, more precisely, the expected shape of the demand curve) do, I think, explain the delay, and the higher price even before considering the subsidy from advertising/data revenue, because there are fewer units to amortize fixed per-design production line costs across.

4K OLED laptops are more available and at a much smaller premium, perhaps because people buy a lot more laptops than desktops and larger monitors.

They get money from the ads, they get money from selling your usage data, they get money by selling space on the remote for streaming apps, and probably through some other means as well.

Sure, but my intuition says they might get a few hundred dollars that way, tops. Is my intuition off by an entire order of magnitude?

Vizio, as a public company, now has to share their ad revenue data:


> from $10.44 to $19.89.

Yeah, I thought $3000 sounded silly, and my only mistake was that I thought it was one order of magnitude silly when in fact it was two orders of magnitude silly.

No, that's completely wrong.

That number is specifically for their SmartCast subscriber service. It's not clear what the rate is, but they subsequently talk about Roku making $40/mo; so it's possible that's the monthly rate. Assuming it is monthly, a television lasts for five years, and that is their only other source of revenue from the televisions, that's ~$1200.

The telling part of the article:

> ...[Vizio's] Platform Plus segment that includes advertising and viewer data had a gross profit of $57.3 million. That’s more than twice the amount of profit it made selling devices like TVs, which was $25.6 million, despite those device sales pulling in considerably more revenue.

If those are actually monthly figures you might be right, but I'm still not convinced that they are. $40/mo sounds implausibly high to me. Even if Roku, in a non-monopolized space, managed to swing a hefty 30% cut, that would mean an average of $120 spent on streaming services per month.

Are sports channels really expensive? Is that what I'm missing?

Most of the ad money comes from WatchFree Plus app on the tv.

"Vizio execs said 77 percent of that money comes directly from advertising, like the kind that runs on its WatchFree Plus package of streaming channels, a group that recently expanded with content targeting. The next biggest contributor is the money it makes selling Inscape data about what people are watching."

>The next biggest contributor is the money it makes selling Inscape data about what people are watching


your argument requires that advertising is cost-effective for the advertiser. what if there were a competing ecosystem, in which advertisers pushed up prices overall in a bid to out-compete each other? essentially, the cost of advertisements is added to the cost of consumer and all other goods. advertising increases consumer costs, and decreases consumer choice.

> Why does (consumer) monitor tech always seem to lag TV tech by a few years?

because there's more money to be made selling TVs than monitors?

consequently, it's TV manufacturers pushing the entire display-maker industry ahead? and so they get the newer tech first??

That's what I suspect, yes, but if that's the case it feels like integrating a monitor manufacturer would be a quick and easy business win for the TV guys.

> it feels like integrating a monitor manufacturer would be a quick and easy business win for the TV guys.

What does this even mean? The same companies that make panels for monitors usually make panels for the TVs as well. They already have production facilities that can manufacture panel sizes ranging from cell-phone size to 200" commercial wall panels.

Most people don't buy the volume to get a special-size panel. If you want a panel, you can save a ton of money buying one they already make. My company has obsoleted perfectly good embedded systems and had to redesign a new UI just because the panel we used went out of production. (I knew all along that doing a pixel-perfect UI instead of one that scaled was a stupid idea, but I got overruled; now we're spending a ton of money making our UI scale.)

Volume? Whether we're talking about TVs or monitors, the most competitive offerings are always the segments that sell in volume.

Just because OLED tech "exists" doesn't mean the equipment exists to make it economically at any particular size, format, etc. We have affordable TV-sized and phone-sized OLEDs because LG has invested in the equipment to make those particular panels in those particular sizes.

If I had to guess I’d say it’s just market size. I’d bet there’s a larger market of people who want a large, high-definition TV for movies and shows than there is of people who want a high-definition monitor.

Most business uses for monitors don’t require high definition, so you’re really looking at specific industries and gaming.

Excluding the Mac market, I don't think Apple sells anything new anymore that is under 200 PPI.

TV display quality is dogshit compared to monitors. Even cheap low-end monitors tend to have better displays. They aren't the same panels at all.

Nope. I'm using an OLED TV as a monitor on my main PC, and it kicks the pants off any monitor I've ever used before, including the 2021 MBP monitor I'm typing this on right now.

Which one are you using if you don't mind me asking?

LG C1. I have to turn it off and on with a remote, take the usual OLED precautions, and tolerate its "smart" nonsense, but the color is gorgeous and the contrast is magical.

Does autodimming work decently on C1? It's really bothering me when I try to use my CX as a monitor.

Agreed, auto-dimming is rough. I turned it off. 60% constant brightness for work, uncapped HDR mode for play.

How do you switch? Can you do it with just the keyboard?

I boot into Windows for gaming, and when I launch a full-screen game an "AutoHDR" badge pops up and the brightness limit is lifted. A similar thing happens when I launch a streaming app on the TV. I don't consume content through Linux because, as far as I can tell, Linux doesn't support HDR yet.

Speaking of booting into Windows, I finally figured out how to make it painless: use a separate hard drive, not a separate partition. I wish I could go back in time 20 years and tell myself that. The number of hours I wasted debugging poorly written installers, bootloaders, and updaters exceeds the cost of hard drives by a factor too terrifying to calculate. Ah well. Now I know.

Any worries of “burn in”? I read the risk of using one as a monitor is that with a computer there are often static images, like your taskbar. Those can burn into the screen permanently, whereas with TV the image changes often. Shows like news often have a bar at the bottom, and I was warned those too can cause burn in. Curious what your experience has been? Thanks

Yes, OLED care is a concern, and I take the usual precautions: no fixed menubars, no tiling WM, rotating desktop wallpapers, and reduced brightness (which isn't a compromise -- anything above 80% makes light-mode content uncomfortable, and auto HDR raises the limit for actual HDR content).

Even if I were not taking these steps and generally abusing the monitor, I wouldn't expect to see burn-in yet, so I can't really speak to how the situation will develop.

I used to have the Acer B326HK (32-inch 4K), which is marketed as a monitor, and it still had really bad burn-in

Isn't that IPS though?

same question

I'm using a Sony 43" X720E. IPS. I'm quite happy with it, can't complain. It's as good as any monitor I've seen except OLED, and the size is wonderful. I would like to have 120Hz though.

I will eventually go to OLED but 48" is the smallest size OLED TV available, and that's a bit bigger than I'd want on my desk.

Hopefully a reasonably-priced 43" OLED will come out.

You can get OTA tuners for incredibly cheap. Like $30 for a basic one. These come with the bonus of allowing you to plug in a USB HDD and record live TV. For a little more you can get a HDHomeRun or Tablo and have a network connected tuner so you can stream live TV to tablets or phones and streaming boxes like the FireTV.
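To make the HDHomeRun idea concrete: the device exposes a plain HTTP API and serves its channel list as JSON at http://&lt;tuner-ip&gt;/lineup.json. A minimal sketch of reading that lineup; the sample payload and IP below are made up, but the field names follow the documented format:

```python
import json

# Stand-in for what a live tuner would return from http://<tuner-ip>/lineup.json
# (normally you'd fetch it with urllib or requests).
SAMPLE_LINEUP = json.dumps([
    {"GuideNumber": "5.1", "GuideName": "ABC",
     "URL": "http://192.168.1.50:5004/auto/v5.1"},
    {"GuideNumber": "5.2", "GuideName": "MeTV",
     "URL": "http://192.168.1.50:5004/auto/v5.2"},
])

def channels(lineup_json: str) -> dict:
    """Map each channel number to a (name, stream URL) pair."""
    return {c["GuideNumber"]: (c["GuideName"], c["URL"])
            for c in json.loads(lineup_json)}

print(channels(SAMPLE_LINEUP)["5.1"][0])  # prints ABC
```

Each `URL` is a direct MPEG-TS stream, so anything from VLC to a media server can play it, no smart TV involved.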

Powered bookshelf speakers are also an alternative to soundbars.

I personally use a monitor as a TV. One con is that some devices like the Fire Stick don't send an HDMI display-off signal; instead they show a black screen in sleep mode, which wakes the monitor and keeps it on. You need a smart switch to easily turn it off.

I would assume most setups leveraging a monitor as the display would also be going through an AVR and that should take care of this kind of thing?

Sound bars and powered speakers are also an option and don't require an AVR, but they also typically don't offer HDMI/AV pass-through, at least on the cheaper side. Going the AVR route adds even more cost and more space, as a decent sound bar or powered speaker set costs less than even a low-end AVR. I went the monitor route as I just don't have room for both a desktop PC setup and a TV. I do have an HDMI switch with an audio extractor, but that also picks up on the Fire TV Stick and auto-switches to it. TBH it's just a design flaw with the Fire TV, and I really wish Amazon would fix it, but I bet it saves them 7 cents or something to do it this way.

That's good to know. I've been through a few units (Denon, Pioneer, Onkyo) and they really feel like the weakest link in my setup, with fussy menus and strange failure modes involving cryptic error codes; the Denon in particular would go into a fault state that was probably a thermal problem but might also have been a voltage regulation issue.

It's definitely overkill given that I'm only driving stereo speakers anyway, so maybe next time I have issues I'll go this direction.

It's amazing how much more expensive those are than traditional TVs; a non-starter, even.

Economies of scale, and subsidies. TVs that ship with Netflix buttons on the remote, Prime Video app, and built-in crappy ads all over the place are being subsidized by those companies.

Meanwhile, no one is buying non-smart TVs, so lower quantities are more expensive.

(Or they know that non-smart TVs are a niche product that they can charge more for.)

Netflix is paying to have the button on the remote? I actually thought it would be the other way around.

Here's a nice NEC 220" display. Helpfully, BH offers monthly payments.


$317,999.00, or $13,250/mo. suggested payments with 24 mos. promo financing. This item is noncancelable and nonreturnable.

Well… it is 18ft diagonally. Of course, you need a place capable of housing such a thing.

A stadium?

4K OLED monitors are insanely expensive. You can get an LG 65" OLED TV for $1,800. The OLED computer monitors I have seen start at $4,000.

AW5520QF (55" 120Hz) is on sale for $2500. That's getting down to about double the cost.

A Gigabyte FO48U (48" OLED, same LG panel as the C1) goes for around $1500

83% of the price for 54% of the area.
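Those ratios check out, assuming the comparison is the ~$1500 48" FO48U against the ~$1800 65" LG TV mentioned upthread (both 16:9, so area scales with the diagonal squared):

```python
def area_ratio(d1: float, d2: float) -> float:
    """Area of a d1-inch 16:9 screen relative to a d2-inch one."""
    # Same aspect ratio, so area scales with the diagonal squared.
    return (d1 / d2) ** 2

print(round(1500 / 1800 * 100))         # ~83 (% of the price)
print(round(area_ratio(48, 65) * 100))  # ~55 (% of the area)
```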

For use as a monitor, smaller panels may be more desirable, more pixels per unit area.

I don't want a 72-inch wall monster. I'm going to be viewing it from about a meter from my face.

But that's just my face. Other faces might work better.

The price spread seems so drastic, more than four times. There is obviously much that I'm missing.

>Apple TV, Fire stick etc.

That's just offloading the problem to a separate device.

I think they help by siloing the snooping.

Our smart TV seems to actively try to figure out what is attached to the HDMI. It's probably reporting that back. At least, every time I plug my notebook into the TV it seems to wait at least 20 seconds before forcing me to select "PC" as the input device. On the old TV the notebook showed up instantaneously.

All TVs are dumb TVs if you only use them as an external display/monitor and don't connect them to the net. I have a dedicated computer for a media center and just use the HDMI1 input on the TV. Never enter menus. Never update the OS. Never agree to anything. Never let the TV "phone home." Never set up WiFi. Never connect a CAT5 to it. Set the input using the remote and forget it. Treat it as a dumb monitor. The computer is connected to the net; the TV is not and has no way to access it.

That's sadly not really true. I have an LG that I thought I was using this way until one day, in the middle of watching some TV, I got a prompt about a OnePlus phone trying to control my TV: did I want to accept? Needless to say I didn't, but I was baffled by what happened. Turns out that the stupid TV is controllable via an app over Bluetooth, and there is no way to turn Bluetooth off. I'm just stuck with my TV constantly advertising its presence to everything around it.

My LG TV started showing an annoying popup message every few seconds, "Unknown device is disconnected", which was caused by a faulty WiFi module, documented here: https://www.theraffon.net/spookcentral/tcp/2019/07/10/lg-sma...

Since I use a Roku stick for streaming, I have no need for the WiFi module in the TV. I was able to follow the instructions in that post, which involve removing the back of the TV and physically disconnecting the Wifi Module, and correct the issue.

I suppose that's one way to make sure the TV is not silently connecting to WiFi, although I'm not sure how difficult that operation would be on other manufacturers' sets.

Be aware that Roku is one of the worst offenders in selling your watching habits.

What are some better alternatives?

Plex, if you have a spare machine to run it on. Free if the household shares a username/password.

Plex also phones home, I thought. I could never understand why anyone uses it. What is wrong with mplayer running purely on a local client?

Local playback is fine, but greatly restricts your options for media boxes. The top end Apple TV 4K for example only has 64GB of storage, and the Nvidia Shield TV caps out at a paltry 16GB. The Apple TV would fit a few shows if you were willing to rotate them out, but wouldn’t be able to provide a “library” experience. The Shield supports external storage, but who wants an external HD or NAS taking up space on their TV stand?

So local playback implies something more like a mini-ITX PC or SBC running some flavor of Linux. That’s fine, but it’s not going to be terribly couch-friendly since it’s going to have a bog standard desktop UI.

So the most popular option is to put all your media on a server in the closet (usually an old laptop or raspi or something), with a client on your streaming box like Plex, Kodi, Infuse, etc connecting to your server.

> with a client on your streaming box like Plex, Kodi, Infuse, etc connecting to your server.

Never heard of Infuse but don't Plex and Kodi both contact the vendor's server? That is enough to make me not want to go anywhere near them. I do know there are home media center distros for the RPI and PC and they are supposedly nice. I'd be ok with using one in principle, but I haven't cared that much about media UI to bother. I just use command line mplayer on my laptop if I want to watch a video, and that's good enough for me. I can understand other people wanting a more TV-like experience, which is also fine.

Plex definitely connects to third party servers, primarily for verifying subscription status (some features are tied to a subscription) and to make connecting to your Plex box from outside of your local network easier (no IP addresses or something like dyndns).

Kodi I’m not sure about. It was originally XBMC (Xbox Media Center) and is open source, so even if there’s some phone-home element, it can be built without it. Another open-source option is Jellyfin, which is a fork of Emby from before that project was closed off.

Why does Plex have subscriptions at all? I understand (at least theoretically) that not all software is free, but charging for it the normal way (you pay money and buy a copy that you then use without it phoning home) works perfectly well too. Anyway, I had the impression that Plex also sends at least the metadata of your video library to its home server.

I didn't realize Kodi was XBMC or that you could build it yourself. Thanks for that info.

Plex pulls the metadata for your library from public sources like thetvdb.com.

The server and clients use plex.tv to authenticate users.

There are alternatives like Jellyfin that should better alleviate your concerns.

Kodi doesn't phone home.

* Plex has clients for "smart" devices (Apple TV, Roku, smartphones, tablets), so you don't have to use a computer just to watch TV

* Plex tracks what you've watched so you don't have to remember what episode you left off on

* Plex serves up rich metadata when browsing your library

I use Jellyfin. It's great FOSS!
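If anyone wants to try it, Jellyfin publishes an official container image, so a compose file is enough to get a server up. A minimal sketch — the paths and port mapping below are placeholders for your own setup:

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin
    ports:
      - "8096:8096"           # web UI and client connections
    volumes:
      - ./config:/config       # server state and database
      - /mnt/media:/media:ro   # your library, mounted read-only
    restart: unless-stopped
```

Then point any Jellyfin client (or a browser) at port 8096 on the host.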

Plex isn't an alternative, it's an app that plays content you host yourself, plus whatever sponsored crap Plex wants to shove into the UI. For people wanting to watch Netflix or Amazon Prime, this isn't a meaningful suggestion.

The Apple TV is a drop-in replacement for a Roku or Fire TV that lacks built-in advertising

I use an AppleTV 4K on every TV. Media is on a NAS (TrueNAS Mini). I use a Kodi port called MrMC (~7USD in App store) to mount the NAS shares and play all the media (supports NFS and SMB). Run a container (on the NAS) with MySQL that MrMC talks to that syncs status between all instances. Works great. Watch in one room, pause and pick up where I left off in the next room.
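For anyone wanting to replicate the shared watch-status part of this: Kodi (and forks like MrMC) read the shared-database config from an advancedsettings.xml file. This is the stock Kodi mechanism, but the host and credentials below are placeholders, not the parent's actual setup:

```xml
<advancedsettings>
  <videodatabase>
    <type>mysql</type>
    <host>192.168.1.10</host> <!-- the NAS/box running MySQL -->
    <port>3306</port>
    <user>kodi</user>
    <pass>kodi</pass>
  </videodatabase>
  <musicdatabase>
    <type>mysql</type>
    <host>192.168.1.10</host>
    <port>3306</port>
    <user>kodi</user>
    <pass>kodi</pass>
  </musicdatabase>
</advancedsettings>
```

Drop the same file into each client's userdata folder and they all share one library database, which is what makes pause-in-one-room/resume-in-another work.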


Apple TV with Infuse. https://firecore.com/infuse

It can connect to Plex, Jellyfin, SMB, NFS and more. Bought a lifetime licence a while back before the price was increased and no regrets so far.

XBMC/Kodi, or AppleTV if you want a managed ecosystem.

AppleTV is not analogous to a Roku stick however, unless I'm misunderstanding. It is an app that already requires a connection to Wifi from the TV; Roku stick is what provides this connection. Additionally, AppleTV does not support Netflix or Disney+ among others. [1]

[1] https://9to5mac.com/2021/06/29/apple-tv-channels-services/

Pedantic Post Alert, but there's a few different products called "Apple TV". There's the hardware box called Apple TV that comes with a remote and plugs into the HDMI on your TV. That IS an alternative to Roku.

Then there's the Apple TV app, which runs on the Apple TV box. This can integrate with some other streaming services and show you their content in the app. Not Netflix or HBO (or some others) though.

And finally (that's all?) there's the Apple TV+ service that's $4.99 a month and lets you watch shows that Apple produces in the Apple TV app on the Apple TV box (or your iPhone, iPad, etc.)

It's all named very well and clear.

> It's all named very well and clear.

I really enjoyed this.

Apple’s handling of the Apple TV product names is baffling to me, given their once brilliant marketing department.

Ok, I've been set straight. I wasn't able to get the full picture searching online.

Confusingly, Apple actually has a few ‘Apple TV’ products. There’s the app that you’re describing, then there’s the physical device, as well as the subscription service.

The physical device is the Roku competitor, and supports Netflix/Disney+ etc.


The app is different. You can certainly use Netflix on a recent Apple TV [1].

[1] https://help.netflix.com/en/node/23887

AppleTV is a device like a Roku box or stick. AppleTV is also an app on iOS devices, roku boxes, etc. AppleTV the device has Netflix and Disney+ available, among many other apps.

I'm wondering if there is a wiki somewhere detailing the different ways to easily disable wireless connections inside consumer TVs...

From the link: > I taped the loose ribbon cable to the inside of the TV with a note saying, "Wifi Module disconnected, so as to disable 'Unknown Device is Disconnected' message." That way if anyone ever looks in there in the future, it will be known what was done :-)

excellent :)

Also, HDMI cables support ethernet connections (the HDMI Ethernet Channel feature).


Sadly - there's basically no hardware support for this. Dead in the water.

That's what large TV manufacturers want you to believe ;)

The same TV manufacturers that are still shipping 10/100 Ethernet ports on TVs in 2021?

You got any evidence to back this up? Hardware that supports this feature is basically nonexistent.

For anyone with LG TVs, iirc there’s a project called OpenLGTV which is working to reverse engineer LG software. Maybe it could help disable some of these “smart” features?

I imagine they'll happily let your neighbor accept the terms and conditions you have yet to accept too =P That's a failure mode I hadn't thought about, I didn't realize people tried to control a TV with bluetooth.

Why would you do this? Initial configuration but then never again? I can't think of a technical reason it meaningfully helps when you can already type in a wifi password with the remote, so I'm inclined to assume that the feature isn't for the customer but rather because they want you to use their app on your phone because the data on your phone is more valuable than the data on your TV.

I used to quite like BT control of the TV back when my kids were babies. Advance warning though, these are going to be pretty niche use cases…

I would feed them and they’d fall asleep in my arms but sometimes I would be terrified to move them in case they woke (and sometimes I’d just enjoy cuddling them as they slept). However the TV remote might have been too far to reach, whereas my phone was always in my pocket.

BT became useful again when they became older and started playing with the TV remote. It was always getting lost. Whereas my phone wasn’t.

So that does make sense, for sure, though my thinking was about how once it's connected to your home WiFi surely your phone is too and is less range limited.

But in terms of installing an app on your phone to emulate a remote, both are the same, so it's no different. At least this way it's for more than just configuration; mandating a connection to a phone you may or may not have seems like a stretch. By making it possible to use BT as a remote you're expanding the features rather than breaking them.

I was given a nice TV with no remote. Used Bluetooth remote app to control TV.

Well, that was mostly a neighbor who accidentally clicked on your TV's bluetooth broadcast signal. But still if you leave it at that and not let your TV connect via bluetooth, it still remains a dumb TV

We have a pretty bad relationship with one of my neighbors. Not gonna get into specifics, but they're quite immature and petty and vindictive, and just generally not pleasant people.

Anyhow, late one night we saw something popup in the list when I was pairing my headphones. "$crappyneighbor's TV". And we saw an opportunity to be a little petty back for once. We connected my phone, and apparently their TV model features no confirmation because it went right through, and started blasting Rick Astley's "Never Gonna Give You Up" until my phone disconnected a couple minutes later. Bit of good harmless fun at their expense.

> and there is no way to turn bluetooth off

Challenge accepted! <grabs pliers and soldering iron>

Warranty voided?

A warranty is never voided if nobody knows what you did.

The profit margins are quite thin on consumer equipment; they can't afford to open up the chassis and have someone look at every chip that might have been tampered with.

Return it to Amazon, you could fill the box with rocks and they will happily ship it on to the next customer.

Most electronics warranties on stuff I've bought have been a year, max. Out of that period? Antenna snipping time!

...if there is one to snip, of course.

You can disable bluetooth with some effort - find the BT antenna (inside) and replace it with a load terminator - the radio thinks there's an antenna still but there is nothing that can be broadcast or received.

Which model have you got? I have a few LG TVs and there’s various options across the different firmwares that might disable that.

Eg LG Connect Apps

I’ve also often wondered if “store mode” disabled all of the radios because that’s the kind of thing you wouldn’t want enabled in a store.

It's an LG OLED55C9PUA. FWIW I went though all the menus, as well as searching the internet, and couldn't find any way to turn it off, nor as far as I could tell could anyone else who discovered this "feature".

Bluetooth on a phone can be disabled.

I only turn on bluetooth when it is needed.

Wired headsets and wired keyboards will work with some of today's phones.

That doesn't sound like a great user experience. "I have this TV that I want to use as a monitor, but I have to turn off a feature on my unrelated device which I want to use with a headset, keyboard, etc". I would rather buy a dumb TV that didn't require me to lose phone features.

It wasn't my phone trying to connect to the TV.

If this is an argument that all TVs are smart TVs, then all TVs with an IR control input device are smart TVs (because universal remotes).

No of course not. The moment you - possibly accidentally - grant the fucker an internet connection over BT, it starts to rat on you. However briefly.

Anything with an IR remote doesn't betray its users like that (through the remote, anyway).

> grant the fucker an internet connection over BT, it starts to rat on you

Or it does that by itself, with a mesh network that your neighbours have setup by accident with their Alexa or Ring - Amazon Sidewalk is an amazing end-run around your own firewall rules.

Thanks for giving me another invention to be angry about, haha.

Makes you wonder about the opportunities for poisoning of Sidewalk networks, just to get some petty revenge.

that's not petty, that's downright patriotic. we have an inalienable right to privacy, security, and liberty, and absolutely no obligation to let companies (or governments) invade or curtail those rights.

> we have an inalienable right to privacy, security, and liberty

I assume you’re referring to the US constitution. The rights you’re referring to constrain the government, not companies. You do not have a legal right to privacy w.r.t. your TV spying on you.

no, not just the constitution, it's inalienable because it's intrinsic to being civilized people, not because some piece of paper says so. on the contrary, companies have no right to spy on us.

Yet they seem to be able to keep on doing it just fine.

so people still kicking dogs makes it ok?

note that a right isn't a passive trait, but an active assertion. every time we give in to what's easy, we lose a little bit of our rights. you maintain rights by speaking out and living by them. we wouldn't need a second amendment were that not the case (n.b., i don't personally own nor desire a gun).

I wonder when home security setups will start including a Faraday cage, at least for certain rooms or areas.

If you build it into all of the exterior facing walls, you can use a cell range extender to tunnel a data connection inside your house, and also similarly with a Wi-Fi AP connected via Ethernet, in the event that you want Wi-Fi signal outside your home as well.

Actually, when I build my next house in 5-10 years I think I might do exactly this

Not gunna lie, i'm looking at building/remodelling a home in the next few years, and i'm seriously considering foiling the walls of some rooms just to build a faraday cage.

It seems way simpler to short the wifi antenna or something than to redo the walls? Bluetooth is probably harder to fix though, if the TV has that.

IR is line-of-sight, and the receiver can be blocked with a small piece of electrical tape.

Same for bluetooth, just build a faraday cage around your TV and you're good!


Only if it's grounded, so that's a step up from the tape.

Bluetooth can be turned off in the settings.

It can't. Which is sort of the point of my comment. If it could I'd have no problem with this feature.

I mean sure, it's (probably) not an internet connection, but Bluetooth and an IR remote aren't really comparable. Bluetooth exchanges information in both directions, IR does not. Bluetooth works from hundreds of feet away (or more) with no line of sight, an IR remote does not. Bluetooth allows for broadcasting arbitrary content to my TV, an IR remote can only change the channel. Given the relative complexities I'm also far more concerned about a security vulnerability existing in the never updated random Bluetooth module/drivers in the TV than an IR receiver that emulates button presses.

My Vizio TV's settings can only be changed with a phone app. It comes with a regular remote, but there's no button for entering the settings menu. For the app to work, it has to be on the same WiFi network as the TV. I dodged the bullet by setting up a restricted WiFi (no internet), but that shows how TV manufacturers try to force you into connecting your TV to the Internet.

How old is your Vizio? All modern Vizio tvs in the past 6 years have a remote with a Menu button. And they ask for accepting Privacy Policy before you can connect to wifi. You can very easily avoid connecting a Vizio to a network and just use it as a dumb tv.

I just got a new Vizio, I only connected it long enough for it download the latest firmware, and then neutered it by removing the Ethernet cable.

You can bypass the acceptance of the privacy policy entirely and it just disables all the smart features (can't use any of the apps or cast to it with Airplay or Google Cast), but will still download firmware updates and install them.

From the network capture I did on it, if you don't accept the policies it only reaches out to the update servers for the firmware, and that's it. Nothing else.

Has Vizio's software gotten any better? I had a 2019 model and the software (had mine offline/neutered too) was too shitty to even use it as a dumb TV. For example it would randomly reboot while we were watching something via the HDMI input, or it almost always took multiple restarts to properly recognize the Vizio sound bar...

I got the new TV three days ago... I'll let you know once I've spent more time with it?

As of right now I have an Apple TV hooked up with my Vizio soundbar and it all works really well. Have had no issues.

I have a 2016 model Vizio that came with a very simple remote without a menu button, expecting you to use a mobile app for all settings. They made a more complete remote available 18 months later as a separate purchase. If you didn't buy the remote you could get softlocked not being able to navigate away from the Smartcast channel.

While the remote does have a menu button, many settings continue to be available only via the mobile app.

Oh wow. That is a new low.

I have seen stories of smart TVs doing active scans for any unsecured wifi network in range, then connecting and phoning home without ever informing the user or showing anything in the menus. Is that a real thing?

Not only this, but how long until they start including a cheap modem and paying for their own cellular connectivity?

Or they buy access to things like this: https://www.amazon.com/Amazon-Sidewalk/b?node=21328123011

No need to include a modem when your customer's neighbor has an internet-connected toaster within bluetooth range of the TV.

My god damn CPAP machine does this. Insurance required the modem in order for them to pay for it.

There are a number of guides on disabling this. That being said, you still need to bring the SD card to the doctor every 6 months for review. This is a good thing. The doctor is trying to help you. I know someone who died because they went on a trip and forgot their CPAP and just stopped breathing in the middle of the night and did not wake up. It happens. When you get a CPAP there is a reason. Take it seriously.

If your issues are physically fixable, there are surgeries that can help (I had 2 things fixed). Most of the time, however, it's just weight. Losing the weight helps and the effect can be tracked in the data the CPAP collects. The doctor can then use the data to adjust the pressure down.

I am speaking from experience. My pressure setting has dropped as I took off the beer gut. This time next year I will likely be able to be done with it.

Ugh this is way worse and actually seems far more likely and easy to accomplish than the public wifi thing...

It's going to be time to crack the sucker open and burn out the traces to/from the modem, isn't it.

E: And in the ongoing arms race, manufacturers ship remotes that no longer use tried-and-true IR but instead go over the same bluetooth or 2.4GHz chip. Some probably already do, I would expect.

My new Panasonic TV has dual-mode remote control - IR+Bluetooth. Bluetooth means that you don't need to point the remote as accurately. Plus you can use a phone as remote control.

That's part of the push for 5G - the idea that it becomes much easier to have many more devices, hence you can drop a modem anywhere.

Jesus Christ, at that point I'd just go in the TV and snip a trace in it.

It won’t be long until we start taking a ddwrt approach and jtagging our tvs and replacing them with custom ROMs.

We’ll be okay for another 5-10 years before they securely start locking that down too.

I'm optimistic within 5-10 years we'll start to see the rise of open source tvs that'll save us. I have no sources to back this up but seeing the release of several open source laptops gives me hope.

See also Amazon Sidewalk.

Some also:

1. Scan the HDMI content and send information back to the mothership to help vendors know what you're watching.

2. Scan the local network for shares and look at media on them, again to send back to the mothership.

Do you have a reference for this? I’ve heard this too but have never actually seen a first hand account

This sounds crazily like science fiction. The TV wants to live so will do anything to stay alive. Isn't this what HAL did to the guy Dave in the end?

I’ve heard of this rumour and hope it’s true and that someone has evidence of a TV manufacturer doing this in the US. https://www.legalmatch.com/law-library/article/wi-fi-connect...

I doubt these stories. People who get a smart TV and don’t connect it to the Internet are a minority. It’s just not worth the cost for any company to develop advanced features to spy on that limited subset.

How would the TV phone home if it's not connected to the internet?

Amazon had (perhaps still has) whispernet - a global coverage of cellular networks. It was used to download books to your kindle, essentially anywhere in the world, without having to have a local plan or WiFi.

It was 2G iirc, enough for book download. Uploading a perceptual hash of what you are watching - e.g. a frame every few seconds - also fits on 2G speeds.

Ok, so...

TV phones home a lot, with info about stuff, sometimes using a lot of bandwidth.

could there be an attack with buying some tvs and putting them on public WiFi - and could one find a way to increase the amount the tv was sending so it amounted to an attack - but still have plausible deniability.

If tvs were put on public wifi, which I guess there is no reason why you shouldn't put your tv on public wifi, and the tv is using lots of bandwidth, and your tv is popular in a country with lots of free wifi, is that tv manufacturer guilty of an attack on the free wifi infrastructure of that country?

I'm asking for a short story or several I might write some day.

Also maybe I'm mistaken, but don't most modern cars have a built in modem for sending telemetry to the manufacturer as well?

Yes. Toyota seems to like putting the transceiver behind the glove box, forcing you to introduce a bundle of rattles to your newly purchased vehicle to remove. Also, I'm pretty sure removing it disables some of the front speakers, requiring manual wire connecting to get them back. You could maybe leave the device in there but encase it in a Faraday cage, but still rattles are introduced.

Ford, iirc, places it on the floor under or behind some seats making it much easier to deal with.

I think MA just passed a law about not allowing vehicles sold there to sell your telemetry data or something.

> I think MA just passed a law about not allowing vehicles sold there to sell your telemetry data or something.

i believe they just have to share the data with others, which they don't want to do.

Whispernet is dead from what I understand. As countries began shutting down 3G networks, Amazon was no longer able to get connectivity at a price point where it was economically feasible.

Why not use helium network?

"doing active scans for any unsecured wifi network in range"

This doesn’t seem like a practical thing for manufacturers to implement. How common are such networks in actuality? Literally the only time I have encountered unsecured Wi-Fi networks in the past several years was guest networks, and all of those were gated by a captive portal that would have blocked any sort of attempt at telemetry or ad serving.

A deal with comcast and their ubiquitous pseudo public hotspots would simplify that.

xfinitywifi, the similar functionality from CenturyLink, etc., or if you happen to live just a bit too close to a McDonald's or other business that has open WiFi. And of course there's the possibility of one of your neighbors screwing up his WiFi setup.

There are a lot of ways this can (and has) gone horribly wrong for privacy.

I doubt any TV does this but if I needed to implement it I would just do it the same way that I use wifi in hotels that have captive portals without paying for it: tunnel IP over DNS.
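To illustrate the trick (not something the parent claims any TV actually does): DNS tunneling works because even behind a captive portal, the resolver usually forwards queries for any domain, so data can ride in the query labels toward a nameserver you control. A toy sketch of just the encoding side, assuming a hypothetical domain `t.example.com` — real tools like iodine also handle framing, downstream responses, and MTU:

```python
import base64

MAX_LABEL = 63  # DNS limits each label to 63 bytes

def encode_to_labels(data: bytes, domain: str) -> str:
    """Pack arbitrary bytes into DNS-safe labels under a domain we control."""
    b32 = base64.b32encode(data).decode().rstrip("=").lower()
    labels = [b32[i:i + MAX_LABEL] for i in range(0, len(b32), MAX_LABEL)]
    return ".".join(labels + [domain])

def decode_from_labels(qname: str, domain: str) -> bytes:
    """Reverse: strip the domain suffix, rejoin labels, undo base32."""
    b32 = qname[: -(len(domain) + 1)].replace(".", "").upper()
    pad = "=" * (-len(b32) % 8)  # restore base32 padding
    return base64.b32decode(b32 + pad)
```

The "exfiltrating" client just resolves `encode_to_labels(chunk, domain)`; the authoritative server for that domain sees the query name and decodes it.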

Xfinity requires login. It is not open.

> doesn’t seem like a practical thing for manufacturers to implement.

In terms of what, lines of code? Engineer days?

Like the parent comment says: scan for some open Wi-Fi in range and connect. If that provides Internet access, it's good to go.

Might have 4g or 5g chips in them. I'm sure they could get a good deal on data if they put it in millions of TVs.

To second this advice, I have a 4 year old 75" Sony Bravia that I did not connect to the internet in any way, despite being an Android device. I have updated the firmware using the instructions on the Sony website, downloaded a package, extracted on a USB stick and let the TV boot on that. Figured it's best to have an up to date operating system for bug fixes, security updates, file format support etc.

Never intend to use the "smart" features on the TV, internet browsing, Netflix etc., I handle that perfectly with my "broken lid" laptop, which is a well maintained machine, typing these very words on it.

So I can vouch that at least for Sony TVs in the KD or KDL series XD, XE, XF, XG (most of them launched a few years ago), you can use them just fine without internet, and you can even update them. You can also turn off Bluetooth and prevent the TV from advertising its presence.

Don't know about the newer OLED and QLED devices, you should try them on in the store.

What do you need security updates for if you don’t connect it to the Internet?

Updates include more than just security fixes.

Yes but to nitpick he already mentioned bugfixes and file format support alongside security fixes (which he wouldn't/shouldn't need). :)

I disagree. You can have security bugs in non-connected devices, for example crafted file formats or metadata that can carry an executable payload, worms that force their way in through some insecure Bluetooth receiver, or even infection via the myriad data channels embedded in modern broadcasting; access to such a broadcast stream might be very lucrative since it would give you access to tens of thousands to tens of millions of devices.

I agree that the target is low value and that the attacker will most likely not bother infecting Android TVs; he would need to force them to connect to WiFi to exfiltrate any data, a very complex and unlikely attack if your TV is not used to monitor uranium centrifuge data.

>Never enter menus. Never update the OS. Never agree to anything. Never let the TV "phone home." Never set up wifi. Never connect a CAT5 to it. Set the input using the remote and forget it. Treat it as a dumb monitor. Computer is connected to the net, TV is not and has no way to access it.

so, are there any TVs where this is not possible? For example, as part of turning on there is a setup procedure that makes it phone home and connect to wifi? If so (I wouldn't know, but I would expect so out of natural cynicism), then the question naturally becomes: on which TVs is what you suggest actually possible?

on edit: I see jiveturkey just posted that in fact what I suspect would be the case of difficulty to keep it from connecting is often the case https://news.ycombinator.com/item?id=29383963

> Never update the OS.

Unless there are new features or fixes you want. My TV needed updates to support Dolby Vision and to fix ARC/CEC bugs. More recent TVs have required updates to support HDMI 2.1 features.

At least you can download the firmware separately and update through USB instead of a network update.

We have the dumbest modern TV I could find, it routinely decides to ask us to agree to the T&Cs again, and complains about not being connected to the internet.

The problem isn't just that they want to be connected to the internet, it's that they're terribly written buggy bloatware devices that glitch continuously when not connected to the internet.

Of course from what others have said it seems like they're also glitchy and terrible when connected to the internet?

If nothing else, it's one more thing in the TV that can break down, and probably add a little bit to the energy consumption of the TV. And if the trend continues, how long until your TV doesn't work at all unless it is connected to the internet?

What’s stopping the TV from taking OTA updates through data sub-channels on TV channels?

Satellite receivers have had this type of capability for a few decades.

That's something that seems like would be feasible in a more monolithic world of broadcasting, and smaller firmwares. They could probably simply broadcast various firmwares at different times; then if the TV detected "hey, that is for me", it could capture the packets.

It's hard to imagine streaming services like YouTube, Netflix and whoever agreeing to do anything of that sort.

Indeed, my guess about this seems to be in the right ballpark:


If you're still relying on DRM functionality, a DRMed source could suddenly demand you upgrade the firmware and refuse to work until you do.

not all. some require a network connection at least to get started. some find an open network and connect "for you"

And some bombard you with "helpful reminders" to set up networking.

I think Gmail.com has asked me about 500 times to try their app and I’ve said “I am not interested” 500 times in a row.

But they seem to have a good feeling about tomorrow.

Same as YouTube Premium, I thought if I did their free trial they'd stop bugging me, nope, now I get the same prompts straight up asking me to pay now.

One day you will misclick and then they have you

I need to convince the next generation of judges that 1/500th consent is not full and complete consent!

Anything that requires a network connection should be returned, period.

I guess if the TV really wants to exfiltrate data, it can speak via HDMI-CEC to all peripherals connected to it. For example if you have a TV box or a game console, it can probably send remote control commands to the TV box, giving it the same user interface you have on your TV box (which quite often, even when connected via Ethernet, can display the WiFi password on screen, or surf the web).

There are also quite often free public wifi networks in the neighborhood. Bluetooth may also be an option. Or they can just add a cellular modem to get your data. Or maybe they can create a wifi mesh network between nearby TVs and share the internet if one has access to it.

Yes, the key being never let it connect to the internet
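Belt and braces: even if the TV somehow gets credentials (a guest pairs it with a hotspot, a neighbor's open network, etc.), you can also blackhole it at the router. On OpenWrt, for example, that's a single firewall rule keyed on the TV's MAC address — the MAC below is a placeholder:

```
# /etc/config/firewall — reject all internet-bound traffic from the TV
config rule
        option name    'Block-TV'
        option src     'lan'
        option dest    'wan'
        option src_mac 'AA:BB:CC:DD:EE:FF'   # the TV's MAC address
        option target  'REJECT'
```

The TV keeps its LAN access (handy for local-only control apps) but can't reach anything outside.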

I would check what TV you have and what patches are available. I have a smart Sony TV, but I've only connected it to a network twice to get patches. I did so once because it fixed an audio issue that was very annoying.

You can often download patches to a USB stick and install them from that.

How about the remote control?

My dumb TV's remote is so simple, it has room for a dedicated button for each HDMI input. I don't have to go through any on-screen widget to pick an input: just hit a physical button on the remote dedicated to going to that HDMI input.

Cycling through picture modes is just a button also.

Never enter menus? What if you'd like to adjust something related to the display; sharpness or something.

How can you access "Press the Red button" content on a dumb TV though? Is there a software client that can access the content over http?

edit: I believe it's called HbbTV

Smart TVs boot slower, display stupid elaborate graphics and menus when doing simple things, and have slow clunky UIs.

Exactly! No need to have anything but HDMI and power connected.

Kinda? You still have to wait for the TV to boot.

well, until your neighbours set up an open wifi network, or a house member / guest sets up a temporary wifi hotspot with no password.

Most TVs will latch onto any open wifi network given the chance - and that's all it takes to upload saved data and pull down updates and ads etc
