This may not sound bad, but it's the equivalent of 200 kWh per year, while the set is only supposed to draw 0.5-1 W according to the specifications.
This would cost more than 50 EUR per year at current prices, and is sizeable compared to the average German citizen's consumption of 1,500 kWh of electricity.
Let's just say this is not ideal given the current energy shortage in Europe...
Modern appliances are increasingly connected to the internet during standby mode, consuming higher amounts of energy. This includes networked televisions and decoders, printers, game consoles and modems.
- Specific requirements for network-connected standby devices were introduced in 2013.
- Since January 2017 networked standby devices must not consume more than 3 to 12 Watts depending on the product.
This compares to 20 to 80 Watts previously. This decrease is expected to save an additional 36-38 TWh.
I have a Sony Android TV, and with almost every setting I change (e.g. brightness, motion blur processing, etc.) I get a warning that it might "increase power consumption". So it might just be that it's tested with the bare minimum functionality enabled. But as soon as you enable any of the advertised features it exceeds that.
For the record mine mostly has 1-2W standby consumption, with some spikes to 10 or 20W for a few seconds sometimes. And once in a while it never enters deep sleep and gets stuck at 20W. Enabling HomeKit (Apple TV) on standby is a sure-fire way to have it sit in standby at 20W permanently; I can't disable it afterwards except with a factory reset. I've not touched Chromecast settings at all so can't say if it has the same effect.
There were devices that consumed 80 watts in standby. I have a cabinet full of equipment including my cable modem and a computer (server, no monitor) with 8 mechanical hard drives. The idle draw is 76 watts or so. How could something be so high?
Cable modems and servers weren’t designed for standby.
Also, if we’re talking consumer tech from the 1970’s or 1980’s, I think it’s twofold.
Firstly, the market cared more about “time to wake up” than about power usage. People had fewer devices, they were (relatively) more expensive, and people were used to fully switching off stuff, so standby power usage wasn’t seen as a big issue.
Secondly, the tech itself didn't exist. Nowadays, your power supply has a CPU that knows how much power your hardware needs and can optimize for it. Back then, you would be lucky if your hardware had a switching power supply.
CPUs didn’t have power states. If you were lucky, you could halt them and have them start again on interrupt. Hibernation didn’t exist, so in computers you’d have to power RAM, anyways. And because of that, OSes didn’t bother with the complex task of properly powering down peripherals when going on standby.
In the 80s, your VCR would frequently be plugged in, and it would keep pumping out a placeholder blue screen at a minimum, particularly if connected via antenna passthrough.
However startup time would be minimal - a couple of seconds - compared to warmup for the CRT, which would usually be at least 10 seconds before it would look settled. But I recall devices generally being much faster to start up. A C64 would boot to READY prompt in 3 seconds. The penalty for turning off completely and needing to wait for boot was small, and for computing devices, wait time was dominated by loading, not booting. You might have to wait for a rewind with a VCR, and you'd certainly wait for tape loading on devices in the C64 class. Consoles with cartridges didn't take long to boot up and load though.
I had a DirecTV satellite receiver with DVR. They rolled out an update and made a big deal their new "Energy Star ECO power savings mode."
I wondered how much power it could really save because it was a DVR with a spinning hard drive that would always record the last 30 minutes of the current channel even if you didn't set it to record anything explicitly.
So I measured it while on and also in "power savings mode" where the video output was off.
14.5W when on and 14.0W when in "power savings mode"
The screen is the primary consumer of energy. I don't know for sure, but I imagine sound is likely more hungry in that regard than any of the other discrete systems on a smart TV.
Edit: additionally, TVs that don't turn off are in some cases of limited utility.
Not tellies, but an interesting look by Dave Jones (of EEVblog fame) into how/why simple devices can use up so much power when idle. A lot of it is down to lazy design of the AC-to-DC circuits.
Remember, in Europe we have a lot more old stuff - especially old houses that are family-owned for 100+ years tend not to have things replaced often unless they're unrepairable.
Even modern class D amps can be 30+ watts when not playing anything tho.
You mean idle "server turned on but not used" or completely off?
Servers usually have out-of-band management built in and that's on all the time, even if the management network is not connected. Should be just a few watts though.
I hope you realize both of your prices are incredibly cheap.
New contracts in the Netherlands are up to €0.88 per kWh (edit: seems I underestimated, 0.88 is the average, the most expensive is €1.14 per kWh now).
At least with the electricity prices in Germany (not referring to the current price level, but with reference to the last decades), a dedicated source of heat (oil, gas, pellets, heat pump, district heating, …) is normally more cost-efficient than simple resistive heating.
So while it's true that that standby consumption is at least good for something in winter, a lower standby consumption and in turn turning up the central heating a little more would normally still be the better choice.
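To make that comparison concrete, here is a minimal sketch with assumed prices and an assumed heat-pump COP - none of these numbers are from the comment above, they're illustrative only:

```python
# Rough comparison of what a kWh of *heat* costs from different sources.
# All prices and the COP are assumptions for illustration.
elec_price = 0.40    # EUR per kWh of electricity (assumed)
gas_price = 0.12     # EUR per kWh of gas (assumed)
heat_pump_cop = 3.0  # kWh of heat delivered per kWh of electricity (assumed)

cost_per_kwh_heat = {
    "resistive (TV standby waste)": elec_price / 1.0,
    "heat pump": elec_price / heat_pump_cop,
    "gas boiler (~90% efficient)": gas_price / 0.9,
}
for source, cost in cost_per_kwh_heat.items():
    print(f"{source}: {cost:.2f} EUR per kWh of heat")
```

With numbers like these, heat from standby waste costs roughly three times as much as the same heat from a heat pump or a gas boiler, which is the point the comment is making.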
Very few people heat their homes with electricity. It's the most expensive energy source. So while the heat is at least useful in the winter, most people would prefer to take the TV idle spend and apply that to a much cheaper heat source.
Wow, that’s worse than anything I’ve heard about around here (northern Bavaria). Power strips with switches are a sound economy move, in either situation.
Yeah, energy costs are just shy of 4x what they were for us last summer. All the sockets in our house have switches on the wall to isolate them. The real trick is just knowing which ones are worth turning off!
Doesn't matter in the UK, there's no incentive to turn off the set to save yourself £20 a month, because the taxpayer will subsidise your excess electric use thanks to the latest idiotic socialist idea to simply get future generations to pay for today's excess use.
I am childless and atheist so I don't particularly give a shit about future generations.
Those who do have a vested interest in the future of mankind are welcome to donate their money to the State.
*for non-uk: the Tories are The Conservative Party, considered right wing, and certainly not 'socialist' by any stretch.
Reading any UK newspaper of late, I would consider a £20pcm saving tip almost front-pageworthy.
I also believe Viz might accept it in their Top-tips section, if you want to reach a wider audience.
Providing help to commercial consumers is essential: look at the energy usage last year and base the support on that.
However, just paying the suppliers directly gives no incentive to invest in increasing energy efficiency - if I'm paying 70p a unit and can spend £1000 to save 3000 units it makes sense. If the government is capping it at 20p a unit then why would I bother paying to increase the efficiency? That means more needless energy use.
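To put rough numbers on that incentive, a minimal sketch - it assumes the 3,000 units saved are per year, which the comment doesn't actually state:

```python
# Hypothetical payback arithmetic for the efficiency investment above.
investment_gbp = 1000
units_saved_per_year = 3000  # assumption: the comment's 3000 units are annual

for price_per_unit in (0.70, 0.20):  # market price vs capped price, in GBP
    annual_saving = units_saved_per_year * price_per_unit
    print(f"At £{price_per_unit:.2f}/unit: save £{annual_saving:.0f}/yr, "
          f"payback in {investment_gbp / annual_saving:.1f} years")
```

Under these assumptions the cap shrinks the annual saving from £2,100 to £600, so the same investment takes several times longer to pay back.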
I don't necessarily disagree with using increasing efficiency as a vehicle for survival in this European apocalypse but you need to consider proportionality/scale of the problem.
In less than two years, the typical annual household bill for grid energy in the U.K. will have increased from ~£1,600 to £6,500. These useless household statistics only help to hide the fact that businesses, which do things with this electricity that don't include running a T.V. pointlessly for 8 hours a day or heating an unoccupied room, are going to suffer more than anyone. It's easy to take the British consumer's word for it and understand that a bill which was a relatively small expense in a past life will now be 4x bigger than before, but the reality is that doing things is becoming significantly less affordable. Manufacturing, hospitality, healthcare, public and private services etc. will all be smashed to bits, and their utility bills aren't going up 300% but more like 500-1500%.
Is it remotely possible to push energy efficiency to the point that we are using half of the energy we were before? It's entirely possible, in my mind. With trillions of dollars of investment.
It's simply not worth it. It's better to take the money and run, if daddy warbucks is offering to give us 3.9% APR on a couple hundred billion dollar loans, than it is to pay in cash and pray for better times with improvements that come along with science and technology.
Loans on the scale that nations take add additional risk to the creditors - the risk of not paying them back is a final one.
If a million people don't pay their credit cards off in a year, it doesn't matter - the market is full of those that will. But if U.K. gilts don't pay out, then a lot more people are screwed, and there's a lot more incentive to keep that cash cow mooing.
That's fine - give the extra money directly to the person consuming the electricity, which allows them to choose whether to use that money to fund the energy use if they need to, while still encouraging them to use less.
My company has a lot of equipment it keeps on needlessly because it's cheaper than the manpower to turn it off and back on. If demand is lower at the weekend they could use the money to shift some of their electricity use to the weekend and pay people more to use it then.
Instead we are encouraging people to use energy as before, which was slightly more profitable than another option, rather than taking the same money but putting the power into the hands of the people using it. It's a paternalistic socialist approach which disempowers the person buying on the market; it's the exact opposite of what the Conservative party should be about.
> This may not sound bad, but it's the equivalent of 200 kWh per year and the set is only supposed to draw 0.5-1W from the specifications
Assuming you never use the TV (if I calculated correctly it's 219 kWh for 24/7/365 standby at 25W of waste - quick check below). Many people have the TV switched on for 3-10 hours per day; heck, I don't watch that much TV and I have it on from 8-11 PM, plus my kids for around one hour a day. I'm sure my retired mother would easily do 8-10 hours per day.
Also, I mean, if you are bothered by it just unplug the TV or switch off the socket.
My whole household yearly consumption including AC (26C) is ~1200 kWh.
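A quick check of that 219 kWh figure, assuming only the 25 W standby draw reported in the thread:

```python
# Standby waste arithmetic: 25 W wasted 24 hours a day, all year.
standby_w = 25
kwh_per_year = standby_w * 24 * 365 / 1000
print(kwh_per_year)  # 219.0 kWh, matching the figure quoted above
```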
I have a 2020 model I bought from Costco and, as slow and shitty as the UI is, it also still burns 16W in standby.
There isn't a way to disable Chromecast either so thanks for nothing Sony and Google.
I need a dumb TV and an Nvidia Shield.
Edit: I take the standby power usage part back. After settings changes including Eco stuff and something to do with remote start and a restart it's at 4W which is very acceptable.
You probably also need to turn off Network Power-On and IP control to prevent it from having network standby enabled. It's in different menus on my Sony.
Recently started monitoring all my appliances. 1W is what I also found to be the number that some reasonable devices (PC sleep, TV off) sit at. Although my subwoofer takes 5W at idle, my PC monitor takes 5W at idle, etc. Looking into some Zigbee smart sockets for my home setup now.
I have manually disabled most services on the TV, so I'm not sure what extra it buys, since I do use some built-in apps for streaming as opposed to using the Nvidia Shield, but it's good to know it exists.
Yeah, my Sony TV from about that same year is the worst TV I've ever owned by far. There are crazy bugs, like when you leave a show or YouTube paused it will just randomly start playing again after 10 minutes. The UI is very slow and unresponsive. If you smash the up/down volume too fast it disconnects from the soundbar. The remote is like something from the 1980s with 1000 buttons on it.
By comparison my Samsung TV is an absolute joy to use. The UI is responsive and works like you expect it to, and the remote is wonderful. It really amazes me how Sony was a leader in the 80s and 90s, then ate dirt hard after the iPod, and now 15+ years later they still haven't reorganized and got their act together to be a leader in anything except maybe camera sensors for smartphones.
Yeah, this is what I've done: I ended up turning it into a dumb TV and hooked an Apple TV up to it to solve all the problems, including the crappy remote. On my Samsung I didn't have to do that, as it's more than good enough already.
The feature never even worked on mine; every time I tried to use it, Chrome complained its version was too old or gave some similarly stupid excuse for why it wouldn't work.
If you're wondering how much energy is consumed by your electronics, a base model kill-a-watt for about $30 will show you watts and cumulative watt-hours. It's a fantastic tool that pays for itself besides giving an eye-opening, interesting experience. You'd be surprised at the things in your house consuming electricity 24/7.
Almost everything in my house is either on power strip surge protectors or just unplugged.
I highly recommend the Athom smart home plugs preflashed with Tasmota. No flipping cloud. None. Pick them up a fair bit cheaper than $30. And of course you can turn them on and off over your local wifi - set timers over your wifi, or via MQTT for the whole smart home thing etc.
Point is they just work too if you don't want to do any of that. Plug them in, connect to their built-in wifi AP, read the power usage stats. Good button on the side that turns the power on and off too.
ESP8266 and a relay. Simple. Effective. Cheap. And just better than anything else I've seen in the space. If anyone has other recommendations I'm all ears.
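For anyone wanting to pull readings off such a plug programmatically, a minimal sketch against the stock Tasmota HTTP command API - the IP address is hypothetical and this assumes no web password is set:

```python
# Poll a Tasmota-flashed smart plug for its instantaneous power reading.
import requests

PLUG_IP = "192.168.1.50"  # hypothetical address of the plug on your LAN

def read_power_w(ip: str) -> float:
    # "Status 8" asks Tasmota for its sensor telemetry; on power-monitoring
    # plugs the response includes an ENERGY block with the current draw.
    resp = requests.get(f"http://{ip}/cm?cmnd=Status%208", timeout=5)
    resp.raise_for_status()
    return float(resp.json()["StatusSNS"]["ENERGY"]["Power"])

if __name__ == "__main__":
    print(f"Current draw: {read_power_w(PLUG_IP)} W")
```

The same data is also published periodically on the `tele/<topic>/SENSOR` MQTT topic if you'd rather subscribe than poll.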
You can also get power strips that toggle power based on how much juice the "control" plug is pulling. I have mine set up with the TV: when the TV is off, all the other plugs are disabled. But when I turn on the TV, it turns on power for the Xbox, Steam Link, etc.
> You'd be surprised at the things in your house consuming electricity 24/7.
I was ready for that. But it turned out that besides an idling RPi, modem, router and APs, nothing actually does. I have a bunch of Zigbee-connected smart plugs, and when I got those, I checked all my devices. Turned out regulations really do help, because all the standby usage was minuscule.
Yea, but it keeps my food cool and thus edible for several days longer than it otherwise would. Power usage of a device isn't the problem, power used not doing useful work for me is.
A chest is much better. The reason is intuitive: if you open it the cold air remains inside except for a little turbulence at the top and, of course, the new air that pours in to replace whatever you’ve just removed.
Whereas, when you open an upright one, cold air immediately starts pouring out the bottom of the opening, with warmer air coming in at the top to replace it.
Your energy use is going to be much more impacted by how full you keep your freezer, for the same reason: the more product you have in there, the less air you have in there.
Your chest freezer will typically use about the same amount of yearly kWh as a standing freezer would, but the chest freezer will typically have about 30% more capacity.
So it's "better," but if you're not going to keep it full, you may not actually be saving any total energy.
Is it outlandish to think that a modern standing freezer with working seals will let less air out the seals than is exchanged by opening the door even a couple of times a week?
Modern standing freezers are starting to get drawers with sides in them now, presumably for reducing air loss while the door is open. And seals are practically air-tight (if you've lived in a place with high humidity, even tiny gaps or holes in the seal cause noticeable condensation and water build-up). So I'm skeptical about the seal claim. Better insulation may be true.
> a modern standing freezer with working seals will let less air out the seals than is exchanged by opening the door
I think the idea is that the seals can conduct heat, and do so much better than whatever arbitrarily-sophisticated thermal isolation is in the walls of the freezer, not that the airflow through the seals is carrying any significant amount of heat.
The chest freezers I've seen have very similar seals to standing freezers, are you saying they're significantly better?
In either case, modern doors build out an insulating section around the seal, and the seal itself creates a second seal against that, so it's air-gapped inside the freezer as well. The seal itself is of course a hollow tube, which is somewhat of an air gap with the outside too. They're pretty good; I would be surprised, given their small surface area, if they caused a huge conductive loss.
Chest freezers do have much thicker insulation on all walls, as they aren't constrained by the standard widths (24 to 40 in in the US, I think? 60-90 cm in the EU).
Chest freezer seals are far better than upright freezer seals because:
* The length of the seal is smaller compared to volume,
* The height of the seal is not constrained by the closing magnets (as mechanical latches are not legal on consumer freezers),
* The temperature differential at the top of the freezer is smallest.
The specific heat capacity of air at constant pressure is ~1 kJ/(kg·K) and 1 m^3 of air has a mass of ~1.2 kg, so 1 kJ/(kg·K) * 1.2 kg = 1.2 kJ/K. Even a freezer that exchanges 2 m^3 of air for outside air at, say, a 70 K differential (the difference between -40 C and a 30 C room) for each opening only has to remove 1.2 kJ/K * 2 * 70 K = 168 kJ, or about 0.05 kWh - a kWh being the standard "unit" of electricity.
A unit of residential electricity in the most expensive US state (Hawaii) is 41.2¢/kWh, so opening our enormous freezer in the worst possible conditions and completely exchanging the air, using the most expensive electricity, costs us about 2¢. It's a complete non-factor.
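The same arithmetic as a small sketch - it uses the comment's own assumptions of 2 m^3 exchanged, a 70 K differential and Hawaii prices, and like the comment it ignores compressor efficiency:

```python
# Cost of one worst-case freezer door opening (heat to remove, not
# compressor electricity - COP is ignored, as in the comment above).
CP_AIR = 1.0          # kJ/(kg*K), specific heat of air at constant pressure
DENSITY_AIR = 1.2     # kg/m^3
volume_m3 = 2.0       # air exchanged per opening (assumption from the comment)
delta_t_k = 70.0      # temperature difference between room and freezer air
price_per_kwh = 0.412 # USD, most expensive US state (Hawaii)

heat_kj = CP_AIR * DENSITY_AIR * volume_m3 * delta_t_k   # ~168 kJ
heat_kwh = heat_kj / 3600                                 # ~0.047 kWh
print(f"{heat_kj:.0f} kJ = {heat_kwh:.3f} kWh = "
      f"{heat_kwh * price_per_kwh * 100:.1f} cents per opening")
```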
Modern freezers really aren't that bad. Chest freezers are better than standing ones because less of the cold air escapes when they're opened.
The US requires all consumer appliances to have an Energy Guide sticker (the yellow and black thing) which estimates annual kWh use and cost at average electric prices. My cheapo 4 cu ft chest freezer uses about 200 kWh annually, or about $25. Not nothing, but $2 a month isn't going to register compared to most things.
I recommend Eve Energy, a bit pricey but works with HomeKit/Matter and 100% cloud-less. (No accounts, no remote access unless you have a HomeKit hub device, etc.) Also generates some nice detailed graphs and exports to CSV.
I found a pioneer amp that sucked 200W sitting there doing nothing. No increase when volume was turned up either. I compared to an older 1970s model that started at 3w and went up from there with the volume.
That's the difference between class A and class B amplifiers. The former draw the same power all the time but are very linear; the latter don't consume much power at idle, in exchange for a small amount of (crossover) distortion.
My one misgiving with hard switching all devices daily is that they all have some kind of AC to DC power conversion with the cheapest capacitors imaginable that are subjected to high inrush currents every time I toggle them on.
I have a close spiritual cousin of one of those, and despite even touting a power factor display it tends to get confused by low standby loads and show nonsensically high power usage instead.
Chromecasts in general seem like big power hogs... I don't need you to use 20 watts all the time to show a shiny background image and a clock...
I really wish users who spend "Just" $/€/£ 40 on a Chromecast knew that it is going to cost them the same amount every year on their bill. I think many of those people wouldn't buy one if they realised they are almost buying a subscription.
Google could adjust them to just not send out any signal unless someone is actually casting. The chipset can totally do a 0.1 watt sleep mode with wifi connected.
> I don't need you to use 20 watts all the time to show a shiny background image and a clock...
Even the latest Chromecast (4K with Google TV) requires a 7.5W power supply [1], it can't possibly draw 20W. Also it looks like it idles at around 1W [2].
I just measured my Chromecast Ultra (earlier 4K model) at a wall power idle of 3.2 to 3.8 watts. Explains why it's always toasty. The Switch while charging two controllers, the TV, and the speaker bar idle at a total of 6.2 watts.
I wish the Switch would turn off rather than use its battery when I kill power to the cradle. At that point I could probably develop the habit of hitting the power strip switch.
It was a big eye opener for me when I realised that the Chromecast is always warm, even when the TV is off and it's not been in use for days. The extension switch is now always off, but I wonder how much this device cost me over the past years.
> Chromecasts in general seem like big power hogs...
The original Chromecast product was a stick that ran off a default 10W USB 2 power supply. I don't know what the current ones are drawing, but I'd be pretty surprised if it was any more.
This is a bug with this particular TV, which is running the Chromecast stack on a much less efficient SOC and clearly has some kind of integration bug which is preventing low power idle states. It's got nothing to do with Chromecast (or Android) as a product or protocol. They should fix the bug.
I love that my home screen is now jammed full of bullshit and it cannot be used without a Google account and if you use your personal Google account Nvidia needs a ton of permissions.
Other than that I actually do the love the little device but I feel more and more jerked around when I shell out only to have junk foisted on me later.
All the ads on the home screen suck now. When I first got my shield I made another account just for it because I didn't want it connected with the rest of my google stuff.
I thought about getting a Shield. But only as a low maintenance Plex Server appliance. I wouldn’t voluntarily look at any interface on my TV designed by an ad company.
I have a few Roku TVs around my house and I recommend them to most people. But I also have AppleTVs connected to my most used TVs.
Their customer service is awful. You have to downgrade the Chromecast built-in on all new models to get it working with certain apps, and there's nothing on the forums about it.
Hulu Live TV app is broken, forums are full of posts, absolutely nothing Nvidia is doing about it besides posting broken English instructions to reformat the device.
My friend sold me hard on the Nvidia Shield PRO and I went with two to replace my house's Rokus. Two weeks into all the problems, I reconnected the Roku for all the streaming services, the Shield is for Plex only.
Just checked my Sony Android TV from 2019 and found it is drawing 16W while turned off. Turning off the Chromecast built-in service, the remote start service, and enabling Eco mode did not seem to make a difference.
I already disliked this TV and was unlikely to buy another from Sony (for example occasionally it will “crash” while in standby mode and stop responding to any input, requiring me to unplug it and plug it back in). But learning this further damages my opinion of both Sony and Google.
I've got a Sony Bravia with Google TV and have disabled everything I can think of: Chromecast, remote start, Bluetooth, etc. I cannot get it under 12W when switched off. Can't believe I got suckered like that. There really should be consequences. Nobody would reasonably think their TV is sucking juice like that when turned off.
Don't forget to check power usage after a few minutes. My old Panasonic TV from 2007 would switch to a pre-standby state initially, then to a "deep standby" after about 60 seconds when the power usage dropped below 1 watt.
Regarding Sony, I found them the worst offender for standby power in my home stereo equipment which I've now replaced.
How would you even go about this though? What store will accept a return “because it consumes too much power when off?” They most likely don’t even have tools available to verify the claim.
Even if you ship it to Sony, when will you see your refund?
Yes: you could take it to the store you bought it from with a printout of the relevant parts of that thread, or (since my smart plugs also show power consumption) I could screenshot that and take it in.
I honestly don't think you'd have an issue at all doing that in the UK at any of the main high-street electronics stores, or somewhere like John Lewis.
Could be a poor implementation of the Chromecast standard when integrated into the TVs.
My Chromecast connected to my TV is also powered by a USB port on the TV. It seems to turn off when the TV is in standby, I have to wait for it to initialize when I turn my TV on. This detail could have been easily lost over the years.
I do agree with people who say the Chromecast was best when it first came out. It did one thing well (albeit in an indirect way). Like all hardware, it has been slowly adding more and more features that will require it to draw more power, connect to the internet more, track more usage, integrate with more Google services, etc.
Television isn’t my primary choice for media, but when I do I just unplug it when I’m done. 0W standby. Same for basically everything other than the refrigerator and wifi.
If you do it right it's hardly a waste of time worth considering. My TV is plugged into an outlet that is controlled by a lightswitch, so I don't need to contort myself around furniture to reach to 'unplug' it. In other situations, I've used a power strip placed so the switch on the strip is easily accessible, sticking out from behind the cabinet. Even if the power draw on this TV is negligible, I think it's a habit worth maintaining because it costs me nothing and it will save me having to reacquire the habit in the future with new appliances. Also part of me feels reassured that technology has not yet rendered me too lazy to use a simple switch.
Yes. OLEDs run a short matrix refresh cycle every few working hours and a long one every hundred or so after you put them to standby; you probably won’t break anything, but the TV quality will degrade quicker. (Not sure about the not breaking part, though ;) I leave mine powered on.)
There are plenty of reasons not to get one, and they don't matter, since for me no other TV technology even remotely makes sense: the picture quality is so vastly superior to anything except plasma.
Since the topic is power draw, it is important to note that plasma TVs require much more power than LCD and OLED, at least when they are turned on. By how much is not trivial to say, but it is usually at least double for a similarly sized screen.
Of course, if your new TV is using 25W in standby, it doesn't really matter, but I am assuming something reasonable here.
I sold my 2010 Panny plasma two years ago and picked up a Sony OLED and wow, what a difference. Once you turn off all the BS image processing and dial in the settings to your liking, it blows plasma out of the water with brighter images, HDR, lower power draw and higher resolution. You're not forced to use the smart-TV nonsense either: don't wire it into your network, don't connect it to your wifi, and only use a third-party device like a Shield or Roku.
I'd say you're missing out at this point IF you consume media regularly, especially movies. Otherwise you probably don't mind much.
My experience, having now bought several used plasmas from about that era, is that they tend to have extremely bright black points, comparable to a ~2015 mid-tier LCD screen. Fairly cheap modern LCDs seem to have superior image quality with sharper pixels as well.
Is this not your experience? Have I just been unlucky and purchased some aging panels? When I got into the market, I was of the impression that plasmas were renowned for their high contrast ratios, but I have been less than impressed.
> Fairly cheap modern LCDs seem to have superior image quality with sharper pixels as well.
I personally don't like the "sharpness", I like the image to be more "fluid", with the colours more easily "blending in" into one another (don't know what's the exact technical term for that).
A few days ago I watched a movie at one of my friends' place, she has a fairly recent LCD screen. I couldn't shake off the feeling of "being in the same room" with the actors, there was almost no "cinematography" involved anymore. I didn't like that, made it quite difficult for me to raise that fourth wall. I know of the HDR on-off setting thingie, but I don't think it was only because of that.
I agree about the burn-in, only in my case it's not that severe, for what it's worth. There's some of it in the upper-left corner from when I used to leave the TV on almost non-stop on a shitty music channel; that channel's logo has left its mark there. It doesn't show up all the time, only when that part of the screen is all white/lit up, and even then only sometimes. Either way, manageable.
Yeah I don't even bother with the built in "smarts". Turn off wifi and bluetooth, don't plug in ethernet, and hook up a decent streaming box like an Apple TV or Shield. With that, it doesn't matter how outdated the TV's software is.
Even basic stuff like framerate matching is still missing after all these years.
They added it in beta a couple of years ago as a manual setting from the quick settings menu, but it's super buggy. It has never worked properly for me. Either the video completely freezes or the audio gets completely out of sync.
And even if you manually set the framerate to 24Hz or 23.9 Hz, there are constant random frame drops.
It's simple enough to say "use an exterior device" (which I do), but even input switching is managed in software and can be a nightmare sometimes. It would be great if there was just a light OS install. It'd likely bring the power use down as well.
Can recommend the receiver route, at least if you also want a decent speaker setup (even just 2.1 stereo + sub). Receiver firmware is much more minimal and no-nonsense, plus most receivers have physical remote buttons for all inputs which makes switching faster.
A modern phone with a 5 Ah (~18.5 Wh) battery can run for many hours streaming to its built-in screen, meaning it uses far less than the ~25W Chromecast standby reported here (anything over 45 minutes on a charge works out to less), despite doing more. Same scenario with a laptop with a 60 Wh battery, which can also run a long time at idle, and even while streaming, and is able to resume from sleep in a second. Running more than 2.5 hours on battery means it's more energy efficient than the Chromecast.
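The break-even arithmetic behind those figures, as a minimal sketch; the 25 W comparison draw is the standby figure reported for this TV upthread, not a measured Chromecast number:

```python
# Break-even runtime for a battery-powered device to beat a fixed draw:
# if the device lasts longer than this on one charge, its average draw
# is lower than the fixed comparison draw.
def breakeven_hours(battery_wh: float, fixed_draw_w: float) -> float:
    return battery_wh / fixed_draw_w

print(breakeven_hours(18.5, 25))  # phone: ~0.74 h, i.e. about 45 minutes
print(breakeven_hours(60.0, 25))  # laptop: 2.4 h, i.e. roughly 2.5 hours
```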
When you are casting from your smartphone to a TV with Chromecast built in, the TV is doing all the work (downloading and decoding the stream). Your phone is just a fancy remote control.
A standalone Chromecast, by contrast, is the one downloading the video and sending it to the TV.
I have a different model of Sony TV that I bought in 2019 that consumes 16W in standby. Not as bad as 25W, but still far higher than I would expect any modern device to consume in standby.
I have every single device plugged in to a switchable extension cord for this reason. No need to trust a manufacturer, not even a 1-5W consumption (times the number of devices).
It feels like over optimization. The base load of my house is around 1kW which is still completely irrelevant in the scheme of how much owning the property costs. The literal cost you’re adding as well as the mental cost of having to switch things on and off all the time shouldn’t be ignored. There’s a timer on the outlets for the soldering gear though, that adds a bit of peace of mind.
I have a fridge freezer, and last time the compressor kicked in my kill-a-watt meter says it used about 400w for about 5 minutes. It does that maybe 2-3 times a day.
Came with the house, but it's a Samsung that looks vaguely something like this [0], at a guess bought in 2019 as that's when the previous owners replaced the kitchen.
It's the obvious, but we turn things off/disconnect when not in use, and we buy the most power efficient versions of devices whenever we replace anything.
My desktop with its Titan Xp is the exception to the rule, but it isn't on much these days.
I would imagine most people don't go behind their TV and hook up their little HDMI dongle every time they want to use it, so it seems reasonable to have a comparable wattage drawn when it's built in - but this is significantly higher than what an idle Chromecast apparently draws (1.8-2W). Why is it so much more?
I own a Sony A80J and this is also what I was measuring.
I'm thankful I did measure this before just blindly turning the feature on.
I have a Bose Soundbar also with Chromecast and that one doesn't suck down 25W+ while on standby and can thankfully turn on the TV via HDMI CEC.
Sadly the whole OS experience on the TV is shitty. It's a 2500€+ TV and the UI is sluggish, the TV takes at least 5-10 seconds to turn on each time, and it still comes with the classic Google ad-like interface.
I bought a Chromecast remote that works via bluetooth to somewhat make the remote feel a bit faster, but it sadly just makes a tiny difference.
The Nvidia Shield still feels like the only real option when you want a non-sluggish interface on your multi-thousand euro/dollar purchase.
I have a Sony OLED TV from 2020 with Android TV and Chromecast, and power usage on my UPS is reporting as 0W when off (probably rounding down). I never had to disable it, but I also never setup network connectivity or the Android TV features in the first place.
My new Philips dumb TV just arrived this weekend and I must say I'm impressed with the power draw. In default calibration, it draws 24w. But since it's in the bedroom and is only used at night, I've set it to ultra-eco mode and lowered the brightness for night viewing. Now the TV along an Amazon Fire TV stick powered by its USB port draw a combined 13w! In standby mode, the power draw doesn't even register on my meter.
For a larger TV (like 50-60 inches) a good on-paper power consumption amount would be between 50 and 70w. You can lower that drastically if the TV has an eco mode and is a dumb TV. In Europe they have to provide a "Product fiche" that lists a number of things including power draw, and an overall "grade". TVs basically never go below grade E because the rules are far too strict.
Oh, also the power LED on this TV is SUPER ANNOYING! I had to open the TV (just a bunch of Phillips screws), pull out the IR receiver circuit board, and cover the LED with electrical tape (the LED, not the IR receiver).
Ooof 768p is a dealbreaker for me unfortunately. Especially since it is meant to be used in a smaller room where you may be sitting closer.
While I have your attention, are the energy ratings seen on electronic devices generally a big jump in real world usage between the various tiers? I just moved to the EU from the U.S. so all of this is very new to me.
For TVs the jumps between grades isn't that big TBH, like maybe on the order of 10-15w per grade. You can get a feel for it by searching for TVs on one of the European amazon sites and clicking on the product fiche link for each one.
My living room TV consumes 70W (as measured by me), which is not the best, but considering how often it's actually turned on it doesn't make enough of a difference. 10W of power draw equates to about 3.7 kWh per year if the TV is only on for an average of an hour per day, which would cost me a little over 1 euro (quick sketch below). The reason I replaced the bedroom TV was that it didn't have a shutoff timer, so it would stay on for days if I didn't notice, drawing 37W the whole time (about 325 kWh, or 115 euros, if it stayed on the whole year).
The biggest ticket items for energy efficient replacement are any appliances that stay on long term (fridge), your oven if you use it a lot (get an air fryer or a combined microwave/grill), any computers that stay on all day, and anything with bad standby consumption (get an energy meter like this one: https://www.amazon.de/gp/product/B00MHNGWDM).
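A small sketch of that annual-cost arithmetic; the €0.35/kWh price is my assumption, chosen to roughly match the euro figures in the comment above:

```python
# Annual energy and cost for a device with a given draw and daily runtime.
def annual_cost(draw_w: float, hours_per_day: float, price_per_kwh: float = 0.35):
    kwh = draw_w * hours_per_day * 365 / 1000
    return kwh, kwh * price_per_kwh

print(annual_cost(10, 1))    # ~3.7 kWh, ~1.3 EUR per year (10 W, 1 h/day)
print(annual_cost(37, 24))   # ~324 kWh, ~113 EUR per year (37 W, always on)
```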
Luckily, I've moved into a brand new apartment so the energy efficiency is really great. The heating is in-floor radiant using the same gas boiler as the hot water in the apartment.
Unfortunately, I apparently picked the worst country to move to when it comes to energy prices in the midst of this war. The Netherlands is absolutely screwing the population by not capping gas/electric prices. Hopefully things stabilize before the winter really sets in or I think many people will be driven to bankruptcy just to pay the electricity bill. My current rate is almost €1.10/kWh. When compared to the paltry 9¢ I was paying in the US, I about had a heart attack when I saw that rate.
I'm trying to limit cooking and hot water, but only time will tell if my efforts are going to make a difference. In the meantime, I'm going down the street to the library to use their facilities and charge my laptop and download movies. I've only just arrived so I don't have any internet setup at home yet. I'm certainly going to have an interesting few months ahead!
This looks similar - same features list, same standby power consumption, higher running power but not bad, no mention of Android TV, 1080p: "Philips 43PFS5507/12 43 Zoll LED Fernseher Für Kleinere Räume"[0]
It so well answers what I want for when the seldom-used 2010 55" Sony Bravia mounted above the piano goes that I'm considering buying one right now while Philips is still selling them.
Thanks for pointing me in this direction, kstenerud!
Nice! 1080p is feasible. I would like 4K, but I haven't found any that seem to exclude smart features with that resolution. Likely because all of these units are old-new stock that hasn't sold in the past 5+ years.
I might just need to do some work to find units that respect turning off the smart settings and don't become severely handicapped in the process. A European brand like Philips probably fits that bill better than most. I've heard horror stories about Samsung and Sony TVs attempting to phone home by any means necessary.
One important thing to know about the EU energy ratings: there is a newer standard since 2021, and an appliance that was “A++” in the older standard might be “G” in the new, but the old rating is still on display because it wasn’t retested on the new.
I found something like that on my Sony STR-DN1080 receiver as well - I bought it here in the EU and found out that it can't be powered on via AirPlay/Chromecast/Spotify like Denons or Marantzes can.
Why? Because it apparently uses 30-35W (!!!) while being in so-called "Network standby" mode which was not legal per EU regulation. The US/Asia models happily gulp down that much power by default though.
Whenever I read something like this I wonder whether there is still some TV without any of these "smart" features. Just a screen and the receiver for television.
Then I can connect an Apple TV, Xbox, PlayStation or whatever. Isn’t there a demand for that?
I mean I don’t need another device which breaks or is unsafe to use after 2 years because it doesn’t get any more updates.
I have 2 Sony Android TVs and never connected them to the internet in 5+ years. I use it just as you described, the Apple TV drives everything. No updates in 5 years and not a single issue.
I have a Sony Bravia from the 2010s (a dumb one) that has a standby mode, engaged when you power off the TV via remote.
The TV also turns off-off when you use the physical button, but there's seemingly no functional difference, the unit still powers on via remote if fully off.
I have a feeling that all the physical power button really does is put the unit on standby, but also turn off the little red standby LED.
How is that even possible? A tablet at full power doesn't use that much. That's way above and beyond what you'd expect even a laptop(Or at least one I've owned recently) to do.
I don't even think something like a RasPi 4 physically can pull that much.
Is this chip-shortage related, and they just couldn't find some switching regulator or something and used a linear one?
I even have a separate Chromecast connected to my Sony TV. The built-in one doesn't appear until the TV has booted up and is fully ready for use, while the original Chromecast Ultra I can reach no matter what, and casting to it starts the TV up.
I found it's simplest to just put the power through an IoT smart plug and tell Google/Alexa/Siri to turn it/them off/on when I want. Standby draw on those things is very low, 1-2W.
Chromecast doesn't have a power sleep mode, right? I've always wondered why, especially since it can CEC power on when asked to. Maybe there's a device that I can use to proxy between the Chromecast and TV to do this?
Bought a mid-tier Bravia from Best Buy and loved it at first, but the UI has become so slow and is bombarded with ads. Also you cannot disable featured ads and app suggestions/channels from Sony. I hate it.
It's really to the point now where we need to ban "smart" TVs, since every single one of the manufacturers is demonstrably unable to handle the responsibility of engineering these things correctly. Sony can't do software, Samsung can't, Google as a proxy is even worse than letting Sony do it, TCL can't, Apple doesn't make a TV. Nobody can handle it. And it's bad for consumers even IF they could do it competently, since TVs need to last a decade and none of this software is decently reliable for a decade's worth of upgrades.
Fucking ban it and be done with it. If the Euros are going to force Apple to adopt USB-C, which is idiotic, then we can do this, too, which would actually do some good.
My TV (and everything connected to it) is connected to a power strip which is connected to a HomeKit switchable outlet. When I'm not watching TV the entire power strip is simply turned off.
That's a good question. I've never measured it... until right now. According to my plug-in power meter, half a watt. Which I think is the lowest amount the meter can display.
How much does an Apple TV 4K 2nd gen use? My 4K first gen turned off when I turned off the TV. The 2nd gen does not. I assume it has something to do with the IoT support.
Never bothered me to login. FF 104 with ublock origin. Black on white.
Relevant posts:
> I measured the standby power consumption on my set, and it uses an extreme 23 W, or 200 kWh per year
For context, this would be roughly 13% of the average German's electricity consumption...
> My Apple TV uses 1.2 W in standby
> Have anyone else measured their set and seen similar figures? Any way to lower it?
> I tried setting Eco to high, and that lowered it by a whopping 0.5 W in standby to 22.5...
> Thanks, I had already turned off remote start (it lowered standby by 0.5 W). Remote Device Settings did not affect it much either
> I reset the TV to factory settings, set it up without logging in to Google, and declining all privacy policies and "extra" services. Still idling around 23 W
> After some trial and error, disconnecting the ethernet cable seems to be the trick. Then the TV is able to idle at 0.5 W after a few minutes
> Something must be terribly wrong with the software though, that keeps the SoC active if it has internet connectivity? I found in the Norwegian specs that "network standby" should draw only 1 W
> I'm no longer using the smart features after I upgraded to an Apple TV last year, so in principle this is not a problem for me to lose Chromecast, but it is quite irresponsible of Sony to have such buggy firmware
[EDIT: I forgot to include the concluding post:]
> Indeed, I disabled Chromecast from the app menu and then the SoC is powering down to the 0.5-1W levels after a minute or so
> So to summarize:
> If the TV has a network connection and Chromecast is enabled, the SoC is not powering down such that the set idles at roughly 25W instead of the 0.5-1W per the specifications
> This is not good if it applies to more people than me, as each set would consume about 200 kWh per year in standby
> German households consume 130 TWh electricity per year, or 1500 kWh per person. So this is sizeable in an European setting, and would help tremendously with the electricity saving targets that may come this winter due to the war
> Now, how to get the attention of Sony and Google to fix this? This may even be in breach of the ecodesign guidelines!
Honestly, I can't give a fuck, dude. We have unlimited clean energy at our fingertips. If you want me to give a fuck about this, first let go of that. I'm for energy production maximization so that we can stop worrying about whether this is a problem.