Sony Android TVs waste 25W in standby due to built-in Google Chromecast (avsforum.com)
478 points by eisa01 on Sept 18, 2022 | 247 comments



This may not sound bad, but it's the equivalent of roughly 200 kWh per year, and according to the specifications the set is only supposed to draw 0.5-1W.

This would cost more than 50 EUR per year at current prices, and is sizeable compared to the average German citizen's electricity consumption of 1,500 kWh.
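
A quick sanity check of those numbers (a sketch only; the 0.25 EUR/kWh rate is an assumed ballpark, and actual tariffs vary a lot, as replies below show):

  standby_w = 25            # measured standby draw of the set
  spec_w = 0.75             # midpoint of the 0.5-1 W the spec sheet promises
  price_eur_per_kwh = 0.25  # assumed ballpark electricity price

  hours_per_year = 24 * 365
  waste_kwh = (standby_w - spec_w) * hours_per_year / 1000   # ~212 kWh
  print(f"{waste_kwh:.0f} kWh wasted per year, about {waste_kwh * price_eur_per_kwh:.0f} EUR")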

Let's just say this is not ideal given the current energy shortage in Europe...


It also is illegal in the EU for new devices. https://ec.europa.eu/info/energy-climate-change-environment/...:

“Network-connected standby devices

Modern appliances are increasingly connected to the internet during standby mode, consuming higher amounts of energy. This includes networked televisions and decoders, printers, game consoles and modems.

- Specific requirements for network-connected standby devices were introduced in 2013.

- Since January 2017 networked standby devices must not consume more than 3 to 12 Watts depending on the product.

This compares to 20 to 80 Watts previously. This decrease is expected to save an additional 36-38 TWh.”

I think this TV is from 2014, so it would have to comply with the older regulations, which I think are https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A..., but that doesn’t mention anything close to 80W.

It does mention 6W and 12W limits, but only starting at January 2015.

⇒ it seems this device doesn’t break that regulation.


I have a Sony Android TV, and with almost every setting I change (e.g. brightness, motion blur processing, etc.) I get a warning that it might “increase power consumption”. So it might just be that it is tested under the bare minimum functionality, but as soon as you enable any of the advertised features it exceeds that.

For the record, mine mostly has 1-2W standby consumption, with some spikes to 10 or 20W for a few seconds. And once in a while it never enters deep sleep and gets stuck at 20W. Enabling HomeKit (Apple TV) in standby is a sure-fire way to have it sit at 20W permanently; I can’t disable it afterwards without a factory reset. I’ve not touched the Chromecast settings at all, so I can’t say if it has the same effect.


There were devices that consumed 80 watts in standby? I have a cabinet full of equipment, including my cable modem and a computer (a server, no monitor) with 8 mechanical hard drives, and the idle draw is 76 watts or so. How could something be so high?


Cable modems and servers weren’t designed for standby.

Also, if we’re talking consumer tech from the 1970s or 1980s, I think the reason is twofold.

Firstly, the market cared more about “time to wake up” than about power usage. People had fewer devices, they were (relatively) more expensive, and people were used to fully switching off stuff, so standby power usage wasn’t seen as a big issue.

Secondly, the tech itself didn’t exist. Nowadays, your power supply has a CPU that knows how much power your hardware needs and can optimize for it. Back then, you would be lucky if your hardware had a switching power supply.

CPUs didn’t have power states. If you were lucky, you could halt them and have them start again on interrupt. Hibernation didn’t exist, so in computers you’d have to power RAM, anyways. And because of that, OSes didn’t bother with the complex task of properly powering down peripherals when going on standby.


In the 80s, your VCR would frequently be plugged in, and it would keep pumping out a placeholder blue screen at a minimum, particularly if connected via antenna passthrough.

However startup time would be minimal - a couple of seconds - compared to warmup for the CRT, which would usually be at least 10 seconds before it would look settled. But I recall devices generally being much faster to start up. A C64 would boot to READY prompt in 3 seconds. The penalty for turning off completely and needing to wait for boot was small, and for computing devices, wait time was dominated by loading, not booting. You might have to wait for a rewind with a VCR, and you'd certainly wait for tape loading on devices in the C64 class. Consoles with cartridges didn't take long to boot up and load though.


What power supply has a CPU in it?


Every MacBook power supply from who knows when does.

Here's a teardown of a 2015 MacBook Pro charger that includes a Texas Instruments MSP430.

https://www.righto.com/2015/11/macbook-charger-teardown-surp...


That's a microcontroller.


> The MSP430 is a mixed-signal microcontroller family from Texas Instruments, first introduced on 14 February 1992.[1] Built around a 16-bit CPU

From Wikipedia. Emphasis mine.


'standby' for many devices used to mean 'leave all the electronics powered up, but turn the screen/sound off'.


I had a DirecTV satellite receiver with DVR. They rolled out an update and made a big deal about their new "Energy Star ECO power savings mode."

I wondered how much power it could really save because it was a DVR with a spinning hard drive that would always record the last 30 minutes of the current channel even if you didn't set it to record anything explicitly.

So I measured it while on and also in "power savings mode" where the video output was off.

14.5W when on and 14.0W when in "power savings mode"

What a joke.


> 14.5W when on and 14.0W when in "power savings mode"

A software upgrade that saves 3% across hundreds of thousands of people doesn’t seem that pointless in the aggregate.


The screen is the primary consumer of energy. I don’t know for sure, but I imagine sound is likely hungrier in that regard than any of the other discrete systems on a smart TV.

Edit: additionally, TVs that don’t fully turn off are in some cases of limited utility.


Not tellies, but an interesting look by Dave Jones [of EEVblog fame] into how/why simple devices can use up so much power when idle. A lot of it is down to lazy design of the AC-to-DC circuits.

https://www.youtube.com/watch?v=_kI8ySvNPdQ


It’s not lazy, it’s cost-optimized. As so often happens, market forces just push any externalities outside the price signal.


It's easy to burn energy when nobody's paying it any attention. Mostly by not really idling, I suspect


Audio amplifiers, especially old ones, can easily draw 100 watts when on 'standby' - ie. the amplifier is turned on but playing no sound.


> 'standby' - ie. the amplifier is turned on but playing no sound.

this isn't standby and is a good way for your vintage amp to become defective


If it is a class A amplifier, I think that would make sense. But how many people have huge class A amplifiers for their home audio?


Old ones...

Everything from the 60's, 70's, 80's and 90's...

Remember, in Europe we have a lot more old stuff - especially old houses that have been family-owned for 100+ years tend not to have things replaced often unless they're unrepairable.

Even modern class D amps can draw 30+ watts when not playing anything, though.


That's interesting, I've never thought of people with old houses correlating with old tech.


> If it is a class A amplifier, I think that would make sense.

The much more common class AB still draws around 25% with no output, so 100 W idle would correspond to a stereo amplifier rated at 200 W per channel, or a 5 x 80 W surround amplifier.


You mean idle as in "server turned on but not used", or completely off?

Servers usually have out-of-band management built in, and that's on all the time, even if the management network is not connected. It should be just a few watts, though.


Running but not really doing anything. No IO, no CPU activity, etc.


Could there be some consequence for misrepresenting the standby consumption?


The TV came on the market in 2015 AFAIK; the C denotes the 2015 model year.


If it’s like cars, they started production and sales 3/4 through 2014


Don’t think so, the thread started in April 2015 :)


Not at the current prices for new customers. I'm currently paying 50.07 euro cents per kWh here in northern Germany.

25 W * 8760 h = 219 kWh; 219 kWh * €0.50/kWh ≈ €109 per year.

The price hurts, yes, but the waste of power is _criminal_.


Hah, I wanted to say, nice current prices, can I have those? But you beat me: €0.41/kWh, also northern Germany.


I hope you realize both of your prices are incredibly cheap. New contracts in the Netherlands are up to €0.88 per kWh (edit: seems I underestimated, €0.88 is the average; the most expensive is €1.14 per kWh now).


I wouldn't count the hours during winter though. It's heating, which you'd pay anyway.


At least with the electricity prices in Germany (not referring to the current price level, but with reference to the last decades), a dedicated source of heat (oil, gas, pellets, heat pump, district heating, …) is normally more cost-efficient than simple resistive heating.

So while it's true that the standby consumption is at least good for something in winter, a lower standby consumption combined with turning up the central heating a little more would normally still be the better choice.


Very few people heat their homes with electricity. It's the most expensive energy source. So while the heat is at least useful in the winter, most people would prefer to take the TV's idle spend and apply it to a much cheaper heat source.


200 kWh can run my Tesla for about 1,000 km, just to give people a picture of the waste.


More than 50 EUR per year even at the prices a lot of us contracted for 2022 before the war, to make things clearer.


Right. I'm in Scotland and it's closer to £175/year than €50.


Wow, that’s worse than anything I’ve heard about around here (northern Bavaria). Power strips with switches are a sound economy move, in either situation.


Yeah, energy costs are just shy of 4x what they were for us last summer. All the sockets in our house have switches on the wall to isolate them. The real trick is just knowing which ones are worth turning off!


  This would cost more than 50 EUR per year
Try: €200 at current forecasts :/


Doesn't matter in the UK, there's no incentive to turn off the set to save yourself £20 a month, because the taxpayer will subsidise your excess electric use thanks to the latest idiotic socialist idea to simply get future generations to pay for today's excess use.


I am childless and atheist so I don't particularly give a shit about future generations. Those who do have a vested interest in the future of mankind are welcome to donate their money to the State.


> the latest idiotic socialist idea to simply get future generations to pay for today's excess use

You misspelled "get future generations to subsidise the power companies' profits today".


Tory government idea?

*For non-UK readers: the Tories are The Conservative Party, considered right wing, and certainly not 'socialist' by any stretch.

Reading any UK newspaper of late, I would consider a £20 pcm saving tip almost front-page-worthy. I also believe Viz might accept it in their Top Tips section, if you want to reach a wider audience.


Supply is low, demand is high, prices have increased as the market does, which sends signals to consumers to reduce usage

The government has taken those signals away by agreeing to pay whatever the price is

They could have simply given £1000, or £100, or £10000, to every household, but they chose not to.


Bailouts are only socialist when they benefit the average person, seemingly :P


These bailouts give money directly to energy producers, taken from future workers.

A far better solution is to give the money directly to the average person and let them choose what to spend it on.


What?

Do you think these price increases are strictly on the retail side?


Providing help to commercial consumers is essential, look at the energy usage last year and base it on that.

However, just paying the suppliers directly gives no incentive to invest in increasing energy efficiency - if I'm paying 70p a unit and can spend £1000 to save 3000 units, it makes sense. If the government is capping it at 20p a unit, then why would I bother paying to increase efficiency? That means more needless energy use.


I don't necessarily disagree with using increasing efficiency as a vehicle for survival in this European apocalypse but you need to consider proportionality/scale of the problem.

In less than two years, the typical annual bill for grid electricity in the U.K. will have increased from ~£1,600 to £6,500. These household statistics only help to hide the fact that businesses, who do things with this electricity that don't include running a T.V. pointlessly for 8 hours a day or heating an unoccupied room, are going to suffer more than anyone. It's easy to take the British consumer's word for it and understand that a bill which was a relatively small expense in a past life will now be 4x bigger than before, but the reality is that doing things is significantly less affordable. Manufacturing, hospitality, healthcare, public and private services etc. will all be smashed to bits, and their utility bills aren't going up 300% but more like 500-1500%.

Is it remotely possible to push energy efficiency to the point that we are using half of the energy we were before? It's entirely possible, in my mind. With trillions of dollars of investment.

It's simply not worth it. It's better to take the money and run, if daddy warbucks is offering to give us 3.9% APR on a couple hundred billion dollar loans, than it is to pay in cash and pray for better times with improvements that come along with science and technology.

Loans on the scale that nations take add additional risk to the creditors - the risk of not paying them back is a final one.

If a million people don't pay their credit cards off in a year, it doesn't matter - the market is full of those that will. But if U.K. gilts don't pay out, then a lot more people are screwed, and there's a lot more incentive to keep that cash cow mooing.


That's fine - give the extra money directly to the person consuming electricity, which lets them choose whether to use that money to fund the energy use if they need to, while still leaving the incentive to cut usage.

My company has a lot of equipment it keeps on needlessly because it's cheaper than the manpower to turn it off and back on. If demand is lower at the weekend, they could use the money to shift some of their electricity use to the weekend and pay people more to use it then.

Instead we are encouraging people to use energy as before, which was slightly more profitable than the alternative, rather than taking the same money but putting the power into the hands of the people using it. It's a paternalistic, socialist approach which disempowers the person buying on the market - the exact opposite of what the Conservative Party should be about.


> This may not sound bad, but it's the equivalent of 200 kWh per year and the set is only supposed to draw 0.5-1W from the specifications

Assuming you never use the TV (if I calculated correctly it's 219 kWh for a 24/7/365 standby waste of 25W). Many people have the TV switched on for 3-10 hours per day; heck, I don't watch that much TV and I have it on from 8-11 PM, plus my kids watch around one hour a day. I'm sure my retired mother would easily do 8-10 hours per day.

Also, if you are bothered by it, just unplug the TV or switch off the socket.

My whole household yearly consumption including AC (26C) is ~1200 kWh.


Especially since an Android phone can idle for days on about 10 Wh of battery, or something like 0.2W.


I have a 2020 model I bought from Costco, and as slow and shitty as the UI is, it also still burns 16W in standby.

There isn't a way to disable Chromecast either, so thanks for nothing, Sony and Google.

I need a dumb TV and an Nvidia Shield.

Edit: I take the standby power usage part back. After settings changes, including the Eco stuff and something to do with remote start, and a restart, it's at 4W, which is very acceptable.

The UI inexcusably still sucks though.


Go to Apps, “See all apps”, scroll down, find “Chromecast built-in” and then click Disable on the app’s settings.

At least that is how it is on my Sony TV.


Thanks, I am already down to 4W after Eco mode, disabling remote start, and a restart. I sometimes use the Chromecast, so I'm going to leave it on.


You probably also need to turn off Network Power-On and IP control to prevent it from having network standby enabled. It's in different menus on my Sony.


4W is still pretty bad. Any modern computer will gladly sit under 1W ready for a quick wake up.


Recently started monitoring all my appliances. 1W is what I also found to be the number that some reasonable devices (PC in sleep, TV off) sit at. Although my subwoofer takes 5W at idle, my PC monitor takes 5W at idle, etc. Looking into some Zigbee smart sockets for my home setup now.


There's a dumb TV or pro mode on Sony TVs. Never used it though.


Thanks, I did not know this existed! You saved me money, here's a virtual beverage :)


Wow, never knew about that.

I have disabled most services on the TV manually, so I'm not sure what extra it buys, since I do use some built-in apps for streaming as opposed to using the Nvidia Shield, but it's good to know it exists.


Yeah, my Sony TV from about that same year is the worst TV I've ever owned, by far. There are crazy bugs: if you leave a show or YouTube paused, it will just randomly start playing again after 10 minutes. The UI is very slow and unresponsive. If you smash the up/down volume too fast, it disconnects from the soundbar. The remote is like something from the 1980s with 1000 buttons on it.

By comparison my Samsung TV is an absolute joy to use. The UI is responsive, works like you expect it to, The remote is wonderful. It really amazes me how Sony was a leader in the 80s and 90s then ate the dirt hard after the iPod and now 15+ years later they still haven't reorganized and got their act together to be a leader in anything except maybe camera sensors for smartphones.


I have a 2019 model and have none of the issues you have. Except sometimes the soundbar doesn’t want to connect when I turn it on.

We use an Apple TV and none of the built-in streaming apps or features. No complaints, and I’ve never even connected it to the internet.


Yeah, this is what I've done: I ended up turning it into a dumb TV and hooked an Apple TV up to it to solve all the problems, including the crappy remote. On my Samsung I didn't have to do that, as it's more than good enough already.


The feature never even worked on mine; every time I tried to use it, Chrome complained its version was too old, or some similarly stupid excuse for why it wouldn't work.


If you're wondering how much energy is consumed by your electronics, a base model kill-a-watt for about $30 will show you watts and cumulative watt-hours. It's a fantastic tool that pays for itself besides giving an eye-opening, interesting experience. You'd be surprised at the things in your house consuming electricity 24/7.

Almost everything in my house is either on power strip surge protectors or just unplugged.


I highly recommend the Athom smart home plugs preflashed with Tasmota. No flipping cloud. None. Pick them up a fair bit cheaper than $30. And of course you can turn them on and off over your local wifi - set timers over your wifi, or via MQTT for the whole smart home thing, etc.

Point is, they just work even if you don't want to do any of that. Plug them in, connect to their built-in wifi AP, read the power usage stats. There's a good button on the side that turns the power on and off too.

https://athom.aliexpress.com/store/group/Tasmota/5790427_517...

An ESP8266 and a relay. Simple. Effective. Cheap. And just better than anything else I've seen in the space. If anyone has other recommendations, I'm all ears.
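
For anyone curious what reading one of these looks like, a minimal sketch of polling a Tasmota power-monitoring plug over its local HTTP interface ("Status 8" returns the energy telemetry); the IP address is just an example for your own network:

  import json
  from urllib.request import urlopen

  PLUG_IP = "192.168.1.50"   # example address of the plug on your LAN

  # Tasmota's HTTP command endpoint; "Status 8" returns sensor/energy readings.
  with urlopen(f"http://{PLUG_IP}/cm?cmnd=Status%208", timeout=5) as resp:
      energy = json.load(resp)["StatusSNS"]["ENERGY"]

  print(f"Current draw: {energy['Power']} W")
  print(f"Today: {energy['Today']} kWh, total: {energy['Total']} kWh")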


How many watts does the smart plug use? :)


You can also get power strips that toggle power based on how much juice the "control" plug is pulling. I have mine set up with the TV - when the TV is off, all the other plugs are disabled. But when I turn on the TV, it turns on power for the Xbox, Steam Link, etc.


> You'd be surprised at the things in your house consuming electricity 24/7.

I was ready for that. But it turned out that besides an idling Raspberry Pi, modem, router and APs, nothing actually does. I have a bunch of Zigbee-connected smart plugs, and when I got those, I checked all my devices. Turns out regulations really do help, because all the standby usage was minuscule.


I’d actually recommend a wifi smart plug with energy monitoring

Easier to read the display on your phone ;)


Yes, get a cheap IoT plug supported by Tasmota and enjoy monitoring, toggling and time-based schedules without "phone home" features :-)


(: Guesstimates are good enough; everything pales in comparison to the freezer - turn that off and the power bill goes down.


Yea, but it keeps my food cool and thus edible for several days longer than it otherwise would. Power usage of a device isn't the problem, power used not doing useful work for me is.


Nah they're really not. My freezer is basically nothing because it only gets opened once every few days. You might need a new freezer.


Actually a freezer only averages like 25-50 watts. It pulls a lot of power when the compressor is running, but the compressor usually isn’t running.


Newer ones are going the way of inverter/variable drive and “always on”, but varying their duty cycle. It’s the future!

A nice bonus is they soft-start (lower breaker-trip risk) and there's no start capacitor to go bad. But more electronics to fail.


Are freezers really that bad? Does it depend if they're chest or standup?

I don't remember my bill going up when I added a freezer, I didn't really measure though.


A chest is much better. The reason is intuitive: if you open it the cold air remains inside except for a little turbulence at the top and, of course, the new air that pours in to replace whatever you’ve just removed.

Whereas, when you open an upright one, cold air immediately starts pouring out the bottom of the opening, with warmer air coming in at the top to replace it.


Your energy use is going to be much more impacted by how full you keep your freezer, for the same reason: the more product you have in there, the less air you have in there.

Your chest freezer will typically use about the same amount of yearly kWh as a standing freezer would, but the chest freezer will typically have about 30% more capacity.

So it's "better," but if you're not going to keep it full, you may not actually be saving any total energy.


The thermal mass of the air is tiny - cooling it is negligible.

Chest freezers have better insulation and door seals, which is why they're more efficient.


> The thermal mass of the air is tiny

> Chest freezers have better [...] door seals

Hmm.

Is it outlandish to think that a modern standing freezer with working seals will let less air out the seals than is exchanged by opening the door even a couple of times a week?

Modern standing freezers are starting to get drawers with sides in them now, presumably to reduce air loss while the door is open. And seals are practically air-tight (if you've lived in a place with high humidity, even tiny gaps or holes in the seal cause noticeable condensation and water build-up). So I'm skeptical about the seal claim. Better insulation may be true.


> a modern standing freezer with working seals will let less air out the seals than is exchanged by opening the door

I think the idea is that the seals can conduct heat, and do so much better than whatever arbitrarily sophisticated thermal isolation is in the walls of the freezer, not that the airflow through the seals is carrying any significant amount of heat.


The chest freezers I've seen have very similar seals to standing freezers, are you saying they're significantly better?

In either case, modern doors build out an insulating section around the seal, and the seal itself creates a second seal against that, so it's air-gapped inside the freezer as well. The seal itself is of course a hollow tube, which is somewhat of an air gap with the outside too. They're pretty good; I would be surprised, given their small surface area, if they caused a huge conductive loss.


Chest freezers do have much thicker insulation on all walls, as they aren't constrained by the standard widths (24 to 40 in in the US, I think; 60-90 cm in the EU).

Chest freezer seals are far better than upright freezer seals because:

* The length of the seal is smaller compared to volume,

* The height of the seal is not constrained by the closing magnets (as mechanical latches are not legal on consumer freezers),

* The temperature differential at the top of the freezer is smallest.

The specific heat capacity of air at constant pressure is ~1 kJ/(kg·K) and 1 m^3 of air has a mass of ~1.2 kg, so 1 kJ/(kg·K) * 1.2 kg = 1.2 kJ/K per m^3. Even a freezer that exchanges 2 m^3 of air for outside air at, say, a 70 K differential (the difference between -40 C and a 30 C room) on each opening only has to remove 1.2 kJ/K * 2 * 70 K ≈ 168 kJ, or about 0.05 kWh (the kWh being the standard billing unit of electricity).

A unit of residential electricity in the most expensive US state (Hawaii) is 41.2¢/kWh, so opening our enormous freezer in the worst possible conditions and completely exchanging the air, using the most expensive electricity, costs us about 2¢. It's a complete non-factor.

Source for prices: https://www.energybot.com/electricity-rates-by-state.html
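
For anyone who wants to fiddle with the assumptions, the same back-of-the-envelope calculation as a few lines of Python (the 2 m^3 exchange, 70 K differential and Hawaii rate are the worst-case figures used above):

  cp_air = 1.0       # kJ/(kg*K), specific heat of air at constant pressure
  density = 1.2      # kg/m^3, rough density of air
  volume_m3 = 2.0    # air completely exchanged per opening (generous)
  delta_k = 70.0     # -40 C freezer vs a 30 C room
  price_kwh = 0.412  # USD/kWh, most expensive US state (Hawaii)

  heat_kj = cp_air * density * volume_m3 * delta_k     # ~168 kJ
  heat_kwh = heat_kj / 3600                            # 3600 kJ per kWh
  print(f"{heat_kj:.0f} kJ = {heat_kwh:.3f} kWh, ~{heat_kwh * price_kwh * 100:.1f} cents per opening")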


What's the energy difference in the seals then?


> are you saying they're significantly better?

closewith (https://news.ycombinator.com/item?id=32890571) is saying that:

> Chest freezers have better insulation and door seals

I'm not claiming anything either way, just pointing out that the seals are for sealing heat, not (primarily) air.


Modern freezers really aren't that bad. Chest freezers are better than standing ones, because less of the cold air escapes when they are opened.

The US requires all consumer appliances to have an Energy Guide sticker (the yellow and black thing) which estimates annual kWh use and cost at average electricity prices. My cheapo 4 cu ft chest freezer uses about 200 kWh annually, or about $25. Not nothing, but $2 a month isn't going to register compared to most things.


I recommend Eve Energy, a bit pricey but works with HomeKit/Matter and 100% cloud-less. (No accounts, no remote access unless you have a HomeKit hub device, etc.) Also generates some nice detailed graphs and exports to CSV.


I got a bunch of TP-Link Kasa devices of that sort. Awesome.


For sure!

I found a Pioneer amp that sucked 200W sitting there doing nothing. No increase when the volume was turned up either. I compared it to an older 1970s model that started at 3W and went up from there with the volume.


That's the difference between class A and class B amplifiers. The former draw the same power all the time but are very linear; the latter don't consume much power at idle, in exchange for a small amount of harmonic distortion.


Some libraries have meters you can check out.


Good suggestion! I just learned Oakland Public Library lends many types of tools: https://oaklandlibrary.org/otll/tool-list/.


My one misgiving with hard switching all devices daily is that they all have some kind of AC to DC power conversion with the cheapest capacitors imaginable that are subjected to high inrush currents every time I toggle them on.


A generic plug consumption meter is around $10 on eBay


I have a close spiritual cousin of one of those, and despite even touting a power factor display it tends to get confused by low standby loads and show nonsensically high power usage instead.


Chromecasts in general seem like big power hogs... I don't need you to use 20 watts all the time to show a shiny background image and a clock...

I really wish users who spend "Just" $/€/£ 40 on a Chromecast knew that it is going to cost them the same amount every year on their bill. I think many of those people wouldn't buy one if they realised they are almost buying a subscription.

Google could adjust them to simply not send out any signal unless someone is actually casting. The chipset can totally do a 0.1 watt sleep mode with wifi connected.


> I don't need you to use 20 watts all the time to show a shiny background image and a clock...

Even the latest Chromecast (4K with Google TV) requires only a 7.5W power supply [1], so it can't possibly draw 20W. Also, it looks like it idles at around 1W [2].

[1] https://support.google.com/chromecast/answer/3046409

[2] https://www.techspot.com/news/95809-streaming-media-player-p...


I just measured my Chromecast Ultra (earlier 4K model) at a wall power idle of 3.2 to 3.8 watts. Explains why it's always toasty. The Switch while charging two controllers, the TV, and the speaker bar idle at a total of 6.2 watts.

I wish the Switch would turn off rather than use its battery when I kill power to the cradle. At that point I could probably develop the habit of hitting the power strip switch.


It was a big eye opener for me when I realised that the Chromecast is always warm, even when the TV is off and it's not been in use for days. The extension switch is now always off, but I wonder how much this device cost me over the past years.


> Chromecasts in general seem like big power hogs...

The original Chromecast product was a stick that ran off a default 10W USB 2 power supply. I don't know what the current ones are drawing, but I'd be pretty surprised if it was any more.

This is a bug with this particular TV, which is running the Chromecast stack on a much less efficient SOC and clearly has some kind of integration bug which is preventing low power idle states. It's got nothing to do with Chromecast (or Android) as a product or protocol. They should fix the bug.


The original required 1A (5W) and the current one is 1.5A (7.5W).


I kinda doubt they use 20w.

I doubt my TV’s USB port pushes out 4A (@5V), and 20W in a heatsink-less enclosure like that would bake.


For comparison, my external Nvidia shield (2019) consumes around 3W in standby and 8W max.

https://www.guru3d.com/articles_pages/nvidia_shield_android_...


And this has built in Chromecast.

I love my Shield.


I love that my home screen is now jammed full of bullshit and it cannot be used without a Google account and if you use your personal Google account Nvidia needs a ton of permissions.

Other than that I actually do the love the little device but I feel more and more jerked around when I shell out only to have junk foisted on me later.


All the ads on the home screen suck now. When I first got my Shield I made another account just for it because I didn't want it connected with the rest of my Google stuff.


I believe the ads are built into the launcher and can be removed by using adb to replace the launcher app.
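
For the curious, the rough shape of that, assuming the TV or Shield has network ADB debugging enabled in developer options; the launcher package name and the replacement APK path here are examples I haven't verified for every model, so treat them as placeholders:

  import subprocess

  TV_IP = "192.168.1.40"                              # example device address
  STOCK_LAUNCHER = "com.google.android.tvlauncher"    # assumed stock launcher package

  def adb(*args):
      # Thin wrapper around the adb CLI; requires adb on PATH.
      subprocess.run(["adb", *args], check=True)

  adb("connect", f"{TV_IP}:5555")
  adb("install", "alternative-launcher.apk")          # any ad-free launcher APK
  # Disable (not uninstall) the stock launcher for the current user;
  # reversible later with "pm enable".
  adb("shell", "pm", "disable-user", "--user", "0", STOCK_LAUNCHER)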


I thought about getting a Shield. But only as a low maintenance Plex Server appliance. I wouldn’t voluntarily look at any interface on my TV designed by an ad company.

I have a few Roku TVs around my house and I recommend them to most people. But I also have AppleTVs connected to my most used TVs.


Rasplex?


Rasplex is a Plex client not a server.

Installing a Plex server on a Raspberry Pi involves these steps:

https://pimylifeup.com/raspberry-pi-plex-media-player/

As opposed to just going to the Google Play store, installing it on the Shield, and getting updates.

Besides, the Raspberry Pi is going to struggle with transcoding; the Shield supports hardware transcoding for Plex.


Apologies for the bad idea. I've grown used to people not properly differentiating between server and client use cases.


Their customer service is awful. You have to downgrade the Chromecast built-in on all new models to get it working with certain apps, and there's nothing on the forums about it.

The Hulu Live TV app is broken, the forums are full of posts, and Nvidia is doing absolutely nothing about it besides posting broken-English instructions to reformat the device.

My friend sold me hard on the Nvidia Shield PRO and I went with two to replace my house's Rokus. Two weeks into all the problems, I reconnected the Roku for all the streaming services, the Shield is for Plex only.

Pretty disappointed.


Has anyone used these Shields to convert old VHS tapes to digital?


Ok so... Why would you ask this question? Can you convert them using another chromecasted device?


Because one of their claims is to have a much better upscaler than the average bear.

https://www.nvidia.com/en-us/shield/support/shield-tv/ai-ups...


Just checked my Sony Android TV from 2019 and found it is drawing 16W while turned off. Turning off the Chromecast built-in service, the remote start service, and enabling Eco mode did not seem to make a difference.

I already disliked this TV and was unlikely to buy another from Sony (for example occasionally it will “crash” while in standby mode and stop responding to any input, requiring me to unplug it and plug it back in). But learning this further damages my opinion of both Sony and Google.


It takes a minute or two to settle down

If that still fails, try disabling wifi and disconnecting the ethernet


I've got a Sony Google Bravia and have disabled everything I can think of: Chromecast, remote start, Bluetooth, etc. I cannot get it under 12W when switched off. Can't believe I got suckered like that. There really should be consequences. Nobody would reasonably think their TV is sucking juice like that when turned off.


Don't forget to check power usage after a few minutes. My old Panasonic TV from 2007 would switch to a pre-standby state initially, then to a "deep standby" after about 60 seconds when the power usage dropped below 1 watt.

Regarding Sony, I found them the worst offender for standby power in my home stereo equipment which I've now replaced.


You could probably return it as defective. At least in the EU there is a regulation about maximum standby power usage.


How would you even go about this though? What store will accept a return “because it consumes too much power when off?” They most likely don’t even have tools available to verify the claim.

Even if you ship it to Sony, when will you see your refund?


Yes. If you took it to the store you bought it from, either with a printout of the relevant parts of that thread, or - since my smart plugs also show power consumption - I could screenshot that and take it in.

I honestly don't think you'd have an issue at all doing that in the UK at any of the main high-street electronics stores, or somewhere like John Lewis.


It could be worth setting it up on a controlled power outlet.


Could be a poor implementation of the Chromecast standard when integrated into the TVs.

My Chromecast connected to my TV is also powered by a USB port on the TV. It seems to turn off when the TV is in standby, I have to wait for it to initialize when I turn my TV on. This detail could have been easily lost over the years.

I do agree with people who say the Chromecast was best when it first came out. It did one thing well (albeit in an indirect way). Like all hardware, it has been slowly adding more and more features that will require it to draw more power, connect to the internet more, track more usage, integrate with more Google services, etc.


That may be the TV not delivering power to the USB port when in standby

That’s how this Sony behaves


Television isn’t my primary choice for media, but when I do I just unplug it when I’m done. 0W standby. Same for basically everything other than the refrigerator and wifi.


Might be worth finding out whether you're wasting your time. As measured on a Kill-a-Watt, my 2013 LG TV consumes 0.0 watts when on standby.


If you do it right it's hardly a waste of time worth considering. My TV is plugged into an outlet that is controlled by a lightswitch, so I don't need to contort myself around furniture to reach to 'unplug' it. In other situations, I've used a power strip placed so the switch on the strip is easily accessible, sticking out from behind the cabinet. Even if the power draw on this TV is negligible, I think it's a habit worth maintaining because it costs me nothing and it will save me having to reacquire the habit in the future with new appliances. Also part of me feels reassured that technology has not yet rendered me too lazy to use a simple switch.


That is less effort than I was envisioning. But personally I prefer to take the stance of not inviting power vampires into my house.


You don’t want to do this with an OLED TV though.


Why? Does turning off an OLED TV cause problems?


Yes. OLEDs run a short matrix refresh cycle every few working hours and a long one every hundred or so after you put them to standby; you probably won’t break anything, but the TV quality will degrade quicker. (Not sure about the not breaking part, though ;) I leave mine powered on.)


It needs power or it degrades faster? Sounds like a great reason for me to never buy an OLED tv.


There are plenty of reasons not to get one, and they don't matter, since for me no other TV technology even remotely makes sense due to the picture quality being so vastly superior to anything except plasma.


> vastly superior than anything except plasma.

One more reason for me to cherish my 2011 plasma TV. Also, it has none of that smart-tv non-sense, which is a big plus.


Since the topic is power draw, it is important to note that plasma TVs require much more power than LCD and OLED, at least when they are turned on. By how much is not trivial but it is usually at least double for a similar sized screen.

Of course, if your new TV is using 25W in standby, it doesn't really matter, but I am assuming something reasonable here.


I sold my 2010 Panny plasma two years ago and picked up a Sony OLED, and wow, what a difference. Once you turn off all the BS image processing and dial in the settings to your liking, it blows plasma out of the water, with brighter images with HDR, lower power draw, and higher resolution. You're not forced to use the smart-TV nonsense either: don't wire it into your network, don't connect it to your Wi-Fi, and only use a third-party device like a Shield or Roku.

I'd say you're missing out at this point IF you consume media regularly, especially movies. Otherwise you probably don't mind much.


> my 2011 plasma TV

My experience, having now bought several used plasmas from about that era, is that they tend to have extremely bright black points, comparable to a ~2015 mid-tier LCD screen. Fairly cheap modern LCDs seem to have superior image quality with sharper pixels as well.

Is this not your experience? Have I just been unlucky and purchased some aging panels? When I got into the market, I was of the impression that plasmas were renowned for their high contrast ratios, but I have been less than impressed.

Burn-in is extremely severe as well.


> Fairly cheap modern LCDs seem to have superior image quality with sharper pixels as well.

I personally don't like the "sharpness"; I like the image to be more "fluid", with the colours more easily "blending" into one another (I don't know the exact technical term for that).

A few days ago I watched a movie at one of my friends' place, she has a fairly recent LCD screen. I couldn't shake off the feeling of "being in the same room" with the actors, there was almost no "cinematography" involved anymore. I didn't like that, made it quite difficult for me to raise that fourth wall. I know of the HDR on-off setting thingie, but I don't think it was only because of that.

I agree about the burn-in, only in my case it's not that severe, for what it's worth. There's some of it in the upper-left corner, from when I used to leave the TV on almost non-stop on a shitty music channel; that music channel's logo has left its mark there. It doesn't "show up" all the time, only when the screen up there is all "white"/lit up, and only sometimes, not always. Either way, manageable.


> I couldn't shake off the feeling of "being in the same room" with the actors, there was almost no "cinematography" involved anymore.

That was most likely motion smoothing. For reasons unknown to anyone but TV OEMs, it's enabled by default.


It's good for those who care about black quality, but I don't recommend it for the average TV buyer due to its shorter lifespan.


Is that the case with all OLED displays? like a phone screen, etc.?


I don't want to do that with anything. It's ridiculous to have to go through the trouble of doing that.


I'm seeing videos and articles saying that leaving it on is to let the OLED pixels refresh every 4 hours. How important is it, really?

The pixel refresh thing doesn't seem to make any scientific sense to me as a layperson.


The name is just branding nonsense. The TV just measures the degradation of pixels when not in use and uses that information to compensate.


I power off my LG OLED via a power strip and it will bug me every so often to run the screen maintenance when I turn it back on


Why not?


OLEDs do pixel refreshing every once in a while when turned off, but obviously they require power to do this.


I've a 2015 model that's becoming harder to use because of the outdated software and I'm not sure of a fix. The panel still looks great.


Depending on what you do either get an Nvidia Shield or an AppleTV.

I'd recommend the Shield for Plex or the AppleTV for streaming Netflix, etc.


The Apple TV is fully capable of playing local content. You can either use the Infuse player or sideload Kodi.


You can also install VLC for the Apple TV (it's on the App Store). It can play content from local network shares.


Yeah but it doesn't have HDMI audio passthrough which is an absolute deal breaker IMO.


Yeah I don't even bother with the built in "smarts". Turn off wifi and bluetooth, don't plug in ethernet, and hook up a decent streaming box like an Apple TV or Shield. With that, it doesn't matter how outdated the TV's software is.


Definitely don't get the Shield if you use the stream apps. Nvidia's support for them is awful and their forums provide no help.


Yeah...

Even basic stuff like framerate matching is still missing after all these years.

They added it in beta a couple of years ago as a manual setting from the quick settings menu, but it's super buggy. It has never worked properly for me. Either the video completely freezes or the audio gets completely out of sync.

And even if you manually set the framerate to 24Hz or 23.9 Hz, there are constant random frame drops.


In which way is it becoming harder to use? The smart TV part of it?


It's simple enough to say "use an external device" (which I do), but even input switching is managed in software and can be a nightmare sometimes. It would be great if there was just a light OS install. It'd likely bring the power use down as well.


If you don't use live TV, consider using an external HDMI switch (or a receiver) so that the TV can stay on a single input?


Can recommend the receiver route, at least if you also want a decent speaker setup (even just 2.1 stereo + sub). Receiver firmware is much more minimal and no-nonsense, plus most receivers have physical remote buttons for all inputs, which makes switching faster.


My 'smart tv', which is used once a week, is plugged in to a light switch. Small improvement in privacy and in power consumption.


A modern phone with a 5 Ah (~18.5 Wh) battery can run for many hours streaming to its built-in screen, meaning it uses far less than a Chromecast despite doing more (anything over 45 minutes of runtime works out more energy efficient). The same goes for a laptop with a 60 Wh battery, which can also run a long time at idle, and even while streaming, and is able to resume from sleep in a second. Running more than 2.5 hours on battery means it's more energy efficient than a Chromecast.
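
The arithmetic behind those break-even points, taking the ~25 W standby figure from the headline as the thing being compared against (average draw is just battery energy divided by runtime):

  STANDBY_W = 25  # the TV-in-standby figure under discussion

  for label, battery_wh, hours in [("phone after 45 min", 18.5, 0.75),
                                   ("laptop after 2.5 h", 60.0, 2.5)]:
      avg_w = battery_wh / hours
      verdict = "below" if avg_w < STANDBY_W else "above"
      print(f"{label}: {avg_w:.1f} W average, {verdict} {STANDBY_W} W")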


Your smartphone does much less, not more.

When you are casting from your smartphone to a TV with Chromecast built in, the TV is doing all the work (downloading and decoding the stream). Your phone is just a fancy remote control.

The Chromecast, on the contrary, is downloading the video and sending it to the TV.


The person you're replying to is talking about playing the video directly on the phone, not casting.

An iPhone 14 Pro Max can play streamed video on the internal display for literally 25 hours on a single charge.

edit: the 14 Pro Max has a 17 Wh battery, which means it draws 0.7 W streaming and displaying video.


Misleading headline - it’s a discussion about one Sony TV that was released in 2015.

Worth checking others, but the headline makes it sound like they all do this.


My Sony XR-77A80J, released in 2021, sucks down at least 25W in standby with Chromecast support on. I think it was more like 28W the last time I measured.


I can't reproduce these results with the 77 inch A80J.

Running the machine in standby for 8 and a half hours and measuring with a Kasa device consumed 32 Wh of power.

According to the app I am using, it would cost about 6 dollars a year to run this in standby at $0.17 per kilowatt-hour.

That said, I do not use HomeKit. Someone in the Bravia forum said enabling HomeKit and allowing HomeKit to turn on the TV causes a huge power draw.


I have a different model of Sony TV that I bought in 2019 that consumes 16W in standby. Not as bad as 25W, but still far higher than I would expect any modern device to consume in standby.


From the article and the other HN comments, if they have Chromecast enabled by default they do.


I have every single device plugged in to a switchable extension cord for this reason. No need to trust a manufacturer, not even with a 1-5W consumption (times the number of devices).


I also put everything except my fridge, freezer, and router on switchable extension cords. Why waste even a single watt-hour on things I don't use?


It feels like over optimization. The base load of my house is around 1kW which is still completely irrelevant in the scheme of how much owning the property costs. The literal cost you’re adding as well as the mental cost of having to switch things on and off all the time shouldn’t be ignored. There’s a timer on the outlets for the soldering gear though, that adds a bit of peace of mind.


In the PG&E service territory in California, that base load will run you about $5500 annually. That's more than the property tax on my townhouse.


You guys are over $0.60/kWh?! That’s 3x what we’re paying in MA (known for “not cheap” electricity).

That would make an EV much less attractive as well.



Had a contract this summer where a kWh was about €0,80.


That's 50x the base load of my house, and would cost thousands a year to run idle. With current costs in the UK, close to $10,000.


You don't have a refrigerator?


I have a fridge freezer, and the last time the compressor kicked in, my kill-a-watt meter said it used about 400W for about 5 minutes. It does that maybe 2-3 times a day.


I'd love to get something that efficient. What model is it?


Came with the house, but it's a Samsung that looks vaguely something like this [0], at a guess bought in 2019 as that's when the previous owners replaced the kitchen.

[0] https://ao.com/product/rb36t672csa-samsung-rb7300t-fridge-fr...


Is heating/hot water in that 1.5kW?


Heating and hot water is gas, cooling is electric.


Amazing. That's about a factor of 4 more than the average total consumption around here, which I am beating by almost a factor of 2.


I’m not sure how people run such low power usages honestly. My desktop is about 700W when playing games alone.


It's the obvious, but we turn things off/disconnect when not in use, and we buy the most power efficient versions of devices whenever we replace anything.

My desktop with its Titan Xp is the exception to the rule, but it isn't on much these days.


Sure enough my newer (2018?) TV was doing this. Dug into the settings and turned on eco mode. Idling around 0.5 to 1 watt now.

Probably time I go around looking at what's eating up power.


If you run an always-on server on every gadget you have at home, don't be surprised if your energy bill starts looking like a datacenter's.


I would imagine most people don't go behind their TV and hook up their little HDMI dongle every time they want to use it, so it seems reasonable to have a comparable wattage drawn when it's built in -- but this is significantly higher than what an idle Chromecast appears to draw (1.8-2W). Why is it so much more?


No, I put mine on a power block with an off switch. When not using that block (most of the time), it should draw 0.0W.


I own a Sony A80J and this is also what I was measuring. I'm thankful I did measure this before just blindly turning the feature on.

I have a Bose Soundbar also with Chromecast and that one doesn't suck down 25W+ while on standby and can thankfully turn on the TV via HDMI CEC.

Sadly the whole OS experience on the TV is shitty. It's a 2500€+ TV, and the UI is sluggish, the TV takes at least 5-10 seconds to turn on each time, and it still comes with the classic Google ad-laden interface.

I bought a Chromecast remote that works via bluetooth to somewhat make the remote feel a bit faster, but it sadly just makes a tiny difference.

The Nvidia Shield still feels like the only real option when you want a non-sluggish interface on your multi-thousand euro/dollar purchase.


Where did you find the option to disable Chromecast on A80J?


I don't remember exactly but it isn't exactly named something related to Chromecast.

It was something about Waking the TV.


I have a Sony OLED TV from 2020 with Android TV and Chromecast, and power usage on my UPS is reporting as 0W when off (probably rounding down). I never had to disable it, but I also never setup network connectivity or the Android TV features in the first place.


Doing the math… this would cost me $50 per year on my electricity bill.


My new Philips dumb TV just arrived this weekend and I must say I'm impressed with the power draw. In the default calibration it draws 24W. But since it's in the bedroom and is only used at night, I've set it to ultra-eco mode and lowered the brightness for night viewing. Now the TV, along with an Amazon Fire TV Stick powered by its USB port, draws a combined 13W! In standby mode the power draw doesn't even register on my meter.


Can you link the model you purchased? I’m in the market for a new TV and Philips is literally in my backyard.


Philips 32PHS5507/12: https://www.amazon.de/gp/product/B09ZHVSVYP

For a larger TV (like 50-60 inches) a good on-paper power consumption figure would be between 50 and 70W. You can lower that drastically if the TV has an eco mode and is a dumb TV. In Europe they have to provide a "Product fiche" that lists a number of things, including power draw and an overall "grade". TVs basically never do better than grade E because the rules are far too strict.

Oh, also the power LED on this TV is SUPER ANNOYING! I had to open the TV (just a bunch of Phillips screws), pull out the IR receiver circuit board, and cover the LED with electrical tape (the LED, not the IR receiver).


Ooof 768p is a dealbreaker for me unfortunately. Especially since it is meant to be used in a smaller room where you may be sitting closer.

While I have your attention, are the energy ratings seen on electronic devices generally a big jump in real world usage between the various tiers? I just moved to the EU from the U.S. so all of this is very new to me.

Thanks!


For TVs the jumps between grades aren't that big TBH, maybe on the order of 10-15W per grade. You can get a feel for it by searching for TVs on one of the European Amazon sites and clicking on the product fiche link for each one.

My living room TV consumes 70W (as measured by me), which is not the best, but considering how often it's actually turned on it doesn't make enough of a difference. 10W of power draw equates to about 3.65 kWh per year if the TV is only on for an average of an hour per day, which would cost me a little over 1 euro. The reason I replaced the bedroom TV was that it didn't have a shutoff timer, so it would stay on for days if I didn't notice, drawing 37W the whole time (about 325 kWh, or 115 euros, if it stayed on the whole year).

The biggest ticket items for energy efficient replacement are any appliances that stay on long term (fridge), your oven if you use it a lot (get an air fryer or a combined microwave/grill), any computers that stay on all day, and anything with bad standby consumption (get an energy meter like this one: https://www.amazon.de/gp/product/B00MHNGWDM).

Also, if you need additional electrical heating in the winter, an oil heater is the most efficient. DeLonghi makes the best ones, for example https://www.amazon.de/-/en/DeLonghi-TRRS0920-electric-radiat...

And never underestimate the savings from good insulation!


Thank you for the detailed write up!

Luckily, I've moved into a brand new apartment so the energy efficiency is really great. The heating is in-floor radiant using the same gas boiler as the hot water in the apartment.

Unfortunately, I apparently picked the worst country to move to when it comes to energy prices in the midst of this war. The Netherlands is absolutely screwing the population by not capping gas/electric prices. Hopefully things stabilize before the winter really sets in or I think many people will be driven to bankruptcy just to pay the electricity bill. My current rate is almost €1.10/kWh. When compared to the paltry 9¢ I was paying in the US, I about had a heart attack when I saw that rate.

I'm trying to limit cooking and hot water, but only time will tell if my efforts are going to make a difference. In the meantime, I'm going down the street to the library to use their facilities and charge my laptop and download movies. I've only just arrived so I don't have any internet setup at home yet. I'm certainly going to have an interesting few months ahead!


This looks similar - same features list, same standby power consumption, higher running power but not bad, no mention of Android TV, 1080p: "Philips 43PFS5507/12 43 Zoll LED Fernseher Für Kleinere Räume" (43-inch LED TV for smaller rooms) [0]

It answers so well what I'll want when the seldom-used 2010 55" Sony Bravia mounted above the piano dies that I'm considering buying one right now, while Philips is still selling them.

Thanks for pointing me in this direction, kstenerud!

[0] https://www.amazon.de/Philips-43PFS5507-12-Fernseher-TV/dp/B...


Nice! 1080p is feasible. I would like 4K, but I haven't found any at that resolution that seem to exclude smart features. Likely because all of these units are new old stock that hasn't sold in the past 5+ years.

I might just need to do some work to find units that respect turning off the smart settings and don't become severely handicapped in the process. A European brand like Philips probably fits that bill better than most. I've heard horror stories about Samsung and Sony TVs attempting to phone home by any means necessary.


One important thing to know about the EU energy ratings: there is a newer standard since 2021, and an appliance that was “A++” in the older standard might be “G” in the new, but the old rating is still on display because it wasn’t retested on the new.

An image with the old and new labels side by side: https://ec.europa.eu/commission/presscorner/detail/en/ip_21_...


Is there a reliable device certification that I should look for when buying electronics?

Also, after reading this thread I decided to invest into a power consumption metering device.


Chromecast Ultra also uses 20W in standby.

I was going through all the devices and found that a TV which is off most of the time was using that much power. Now I keep it unplugged.


The Chromecast Ultra comes with a 5W power supply (I know, I have one and just checked), so this cannot be true.

I don't know about you, but I literally don't care about 5W; it's getting converted to heat anyway, just a bit less efficiently than my gas boiler.



I bought a 55” LG 4k tv in 2018 and I am so glad that I bought it before all this always-on ads-infused nonsense became integrated.

It works great and I hope it lasts a long time.


I found something like that on my Sony STR-DN1080 receiver as well - I bought it here in the EU and found out that it can't be powered on via AirPlay/Chromecast/Spotify like Denons or Marantzes can.

Why? Because it apparently uses 30-35W (!!!) while in the so-called "network standby" mode, which would not be legal per EU regulation. The US/Asia models happily gulp down that much power by default, though.


Whenever I read something like this I wonder whether there is still some TV without any of these "smart" features. Just a screen and a TV tuner. Then I can connect an Apple TV, Xbox, PlayStation or whatever. Isn't there a demand for that? I mean, I don't need another device which breaks or is unsafe to use after 2 years because it doesn't get any more updates.


I have 2 Sony Android TVs and never connected them to the internet in 5+ years. I use it just as you described, the Apple TV drives everything. No updates in 5 years and not a single issue.


I have a Sony Bravia from the 2010s (a dumb one) that has a standby mode, engaged when you power off the TV via remote.

The TV also turns "off-off" when you use the physical button, but there's seemingly no functional difference: the unit still powers on via remote when "fully" off.

I have a feeling that all the physical power button really does is put the unit on standby, but also turn off the little red standby LED.


> the unit still powers on via remote if fully off.

Then it's not fully off.

> I have a feeling that all the physical power button really does is put the unit on standby, but also turn off the little red standby LED.

Sounds about right.


How is that even possible? A tablet at full power doesn't use that much. That's way above and beyond what you'd expect even a laptop (or at least one I've owned recently) to draw.

I don't even think something like a RasPi 4 physically can pull that much.

Is this chip-shortage related, and they just couldn't find some regulator or something and used a linear one?


It's not chip shortage related, the TV models being discussed are from 2015-16.


Vizio TVs do the same and use about the same 25 W. You have to turn on "eco mode" in the settings, which increases boot time.


It's really a problem with how Google Chromecast works: it requires the TV to be in a powered-on state.

You can just turn it off.


I even have a separate Chromecast connected to my Sony TV. The built-in one doesn't appear until the TV has booted up and is fully ready for use, while the original Chromecast Ultra I can reach no matter what, and casting to it starts the TV up.


What does it need to use 25W for? Is the CPU running in a busy-wait loop for some reason?


I like my dumb monitor that doesn't have cable or even DTV (https://en.wikipedia.org/wiki/Digital_television)...


I found it's simplest to just put the power through an IoT smart plug and tell Google/Alexa/Siri to turn it on or off when I want it. Standby draw on those plugs is very low, 1-2 W.
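For what it's worth, a rough sketch of scripting that directly, assuming a TP-Link Kasa plug and the python-kasa library (the IP address is a placeholder; other plug brands have similar libraries):

    import asyncio
    from kasa import SmartPlug  # pip install python-kasa

    async def tv_power(on: bool) -> None:
        plug = SmartPlug("192.168.1.50")  # placeholder: address of the smart plug
        await plug.update()               # fetch current state before sending commands
        if on:
            await plug.turn_on()
        else:
            await plug.turn_off()

    # Cut all standby draw from the TV (and anything else behind the plug):
    asyncio.run(tv_power(False))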


Chromecast doesn't have a sleep mode, right? I've always wondered why, especially since it can power the TV on via CEC when asked to. Maybe there's a device I can use to proxy between the Chromecast and the TV to do this?


I use the Chromecast to wake up the TV via CEC. Just starting to play something will turn on the TV.

If only it could turn off the TV instead of showing the wallpapers, it would be perfect.
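If you have something like a Raspberry Pi on another input, this is also easy to script yourself. A rough sketch driving libcec's cec-client from Python (assumes cec-client is installed, the device sits on an HDMI port, and the TV has CEC enabled; exact behaviour varies by TV):

    import subprocess

    def cec(command: str) -> None:
        # -s runs a single command read from stdin and exits; -d 1 keeps logging quiet
        subprocess.run(["cec-client", "-s", "-d", "1"],
                       input=command.encode(), check=True)

    cec("on 0")         # ask the device at CEC logical address 0 (the TV) to power on
    # cec("standby 0")  # ...or put it into standby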


Bought a mid-tier Bravia from Best Buy and loved it at first, but the UI has become so slow and is bombarded with ads. I also cannot disable featured ads and app suggestions/channels from Sony. I hate it.


It's really to the point now where we need to ban "smart" TVs, since every single one of the manufacturers is demonstrably unable to handle the responsibility of engineering these things correctly. Sony can't do software, Samsung can't, Google as a proxy is even worse than letting Sony do it, TCL can't, Apple doesn't make a TV. Nobody can handle it. And it's bad for consumers even IF they could do it competently, since TVs need to last a decade and none of this software is decently reliable for a decade's worth of upgrades.

Fucking ban it and be done with it. If the Euros are going to force Apple to adopt USB-C, which is idiotic, then we can do this, too, which would actually do some good.


My TV (and everything connected to it) is connected to a power strip which is connected to a HomeKit switchable outlet. When I'm not watching TV the entire power strip is simply turned off.

Standby power: 0W


How much power does the switchable outlet itself draw?


That's a good question. I've never measured it... until right now. According to my plug-in power meter, half a watt, which I think is the lowest amount the meter can display.


How much does an Apple TV 4K 2nd gen use? My 4K first gen turned off when I turned off the TV. The 2nd gen does not; I assume it has something to do with the IoT support.



My 2nd gen idles at 1.2 W.

Active use is 3-6 W.


Yet another reason to ignore the "smarts" of your smart TV and use an external box.


Just for my understanding: what is the power consumption of an Apple TV in idle mode? And of a Raspberry Pi in idle mode?


According to this article, 0.3 W at idle and up to 5.58 W when streaming (both measured at the socket): https://www.techspot.com/news/95809-streaming-media-player-p...


Idle, as in the OS not even running, is around 2.5 W for a Raspberry Pi. There's no real power management on the Pi either, so you can't sleep it.


Just measured my RPi Zero 2: it drew 0.5-1 W without HDMI after boot.


It wastes your time, and even more power, when you watch it :)


Horrible site. It bothered me twice to log in and then went gray on gray, very hard to read.


Never bothered me to log in. FF 104 with uBlock Origin. Black on white.

Relevant posts:

> I measured the standby power consumption on my set, and it uses an extreme 23 W, or 200 kWh per year. For context, this would be approx 10% of the average consumption of a German...

> My Apple TV uses 1.2 W in standby

> Has anyone else measured their set and seen similar figures? Any way to lower it?

> I tried setting Eco to high, and that lowered it by a whopping 0.5 W in standby to 22.5...

> edit: According to the specs, it's supposed to draw 0.5 W in standby? What is going on?

> https://www.sony.com/electronics/support/televisions-project...

> Thanks, I had already turned off remote start (it lowered standby by 0.5 W). Remote Device Settings did not affect it much either

> I reset the TV to factory settings, set it up without logging in to Google, and declining all privacy policies and "extra" services. Still idling around 23 W

> After some trial and error, disconnecting the ethernet cable seems to be the trick. Then the TV is able to idle at 0.5 W after a few minutes

> Something must be terribly wrong with the software though, that keeps the SoC active if it has internet connectivity? I found in the Norwegian specs that "network standby" should draw only 1 W

> I'm no longer using the smart features after I upgraded to an Apple TV last year, so in principle this is not a problem for me to lose Chromecast, but it is quite irresponsible of Sony to have such buggy firmware

[EDIT: I forgot to include the concluding post:]

> Indeed, I disabled Chromecast from the app menu and then the SoC is powering down to the 0.5-1W levels after a minute or so

> So to summarize:

> If the TV has a network connection and Chromecast is enabled, the SoC is not powering down, such that the set idles at roughly 25 W instead of the 0.5-1 W per the specifications

> This is not good if it applies to more people than me, as each set would consume about 200 kWh per year in standby

> German households consume 130 TWh of electricity per year, or 1500 kWh per person. So this is sizeable in a European setting, and would help tremendously with the electricity-saving targets that may come this winter due to the war

> Now, how to get the attention of Sony and Google to fix this? This may even be in breach of the ecodesign guidelines!

by eisa01
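A quick sanity check of those figures, using only the numbers quoted above (a sketch, not new measurements):

    standby_watts = 23                  # measured standby draw from the quoted post
    kwh_per_year = standby_watts * 24 * 365 / 1000
    print(f"{kwh_per_year:.0f} kWh/year")          # ~201 kWh, in line with the ~200 kWh figure

    per_person_avg = 1500               # kWh per person per year in Germany, as quoted above
    print(f"{kwh_per_year / per_person_avg:.0%}")  # ~13%, the same order as the "approx 10%" quoted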


Honestly, I can't give a fuck, dude. We have unlimited clean energy at our fingertips. If you want me to give a fuck about this, first let go of that. I'm for energy production maximization so that we can stop worrying about whether this is a problem.


What about cost? Having this TV running idle 24/7 would cost about £150 a year with current rates, and that’s only due to go up


That's just free money, right? I find that convincing. It's not much, but it's just burned for no utility. Cool. Consider me convinced.



