“Modern appliances are increasingly connected to the internet during standby mode, consuming higher amounts of energy. This includes networked televisions and decoders, printers, game consoles and modems.
- Specific requirements for network-connected standby devices were introduced in 2013.
- Since January 2017 networked standby devices must not consume more than 3 to 12 Watts depending on the product.
This compares to 20 to 80 Watts previously. This decrease is expected to save an additional 36-38 TWh.”
I have a Sony Android TV, and with almost every setting I change (e.g. brightness, motion blur processing, etc.) I get a warning that it might “increase power consumption”. So it might just be that it is tested with the bare minimum functionality. But as soon as you enable any of the advertised features it exceeds that.
For the record, mine mostly has 1-2W standby consumption, with some spikes to 10 or 20W for a few seconds. And once in a while it never enters deep sleep and gets stuck at 20W. Enabling HomeKit (Apple TV) on standby is a sure-fire way to have it sit at 20W in standby permanently; I can't disable it afterwards except with a factory reset. I've not touched the Chromecast settings at all, so I can't say if they have the same effect.
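For a sense of scale, here is a rough back-of-the-envelope sketch, assuming round-the-clock standby and an illustrative electricity price of €0.30/kWh (both assumptions, not figures from the regulation):

```python
HOURS_PER_YEAR = 24 * 365   # ~8760 h, assuming the set sits in standby around the clock
PRICE_PER_KWH = 0.30        # illustrative electricity price in EUR per kWh (assumption)

# 1 W: the measured deep-sleep figure, 20 W: the stuck-at-20W state, 80 W: the old worst case
for watts in (1, 20, 80):
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    print(f"{watts:>2} W standby ≈ {kwh_per_year:6.1f} kWh/year ≈ €{kwh_per_year * PRICE_PER_KWH:.0f}/year")
```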
There were devices that consumed 80 watts in standby? I have a cabinet full of equipment, including my cable modem and a computer (a server, no monitor) with 8 mechanical hard drives, and the idle draw is 76 watts or so. How could something be so high?
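As a very rough sanity check, a handful of always-on components gets you into that range quickly; every per-component figure below is an assumed typical value, not a measurement of this cabinet:

```python
# Back-of-the-envelope idle budget; all figures are assumptions, not measurements.
idle_watts = {
    "cable modem": 8,
    "server board + CPU at idle": 25,
    "8 x 3.5-inch HDDs (~5 W each, spinning idle)": 40,
    "fans, PSU losses, NIC/BMC": 5,
}
total = sum(idle_watts.values())
print(f"estimated idle draw: ~{total} W")   # ~78 W, same ballpark as the measured 76 W
```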
Cable modems and servers weren’t designed for standby.
Also, if we’re talking consumer tech from the 1970s or 1980s, I think the reason is twofold.
Firstly, the market cared more about “time to wake up” than about power usage. People had fewer devices, they were (relatively) more expensive, and people were used to fully switching off stuff, so standby power usage wasn’t seen as a big issue.
Secondly, the tech itself didn’t exist. Nowadays, your power supply has a CPU that knows how much power your hardware needs and can optimize for it. Back then, you would be lucky if your hardware had a switching power supply.
CPUs didn’t have power states. If you were lucky, you could halt them and have them start again on interrupt. Hibernation didn’t exist, so in computers you’d have to keep the RAM powered anyway. And because of that, OSes didn’t bother with the complex task of properly powering down peripherals when going into standby.
In the 80s, your VCR would frequently be plugged in, and it would keep pumping out a placeholder blue screen at a minimum, particularly if connected via antenna passthrough.
However, startup time would be minimal - a couple of seconds - compared to the warm-up for the CRT, which would usually take at least 10 seconds before the picture looked settled. But I recall devices generally being much faster to start up. A C64 would boot to the READY prompt in 3 seconds. The penalty for turning off completely and needing to wait for boot was small, and for computing devices, wait time was dominated by loading, not booting. You might have to wait for a rewind with a VCR, and you'd certainly wait for tape loading on devices in the C64 class. Consoles with cartridges didn't take long to boot up and load, though.
I had a DirecTV satellite receiver with DVR. They rolled out an update and made a big deal of their new "Energy Star ECO power savings mode."
I wondered how much power it could really save because it was a DVR with a spinning hard drive that would always record the last 30 minutes of the current channel even if you didn't set it to record anything explicitly.
So I measured it while on and also in "power savings mode" where the video output was off.
14.5W when on and 14.0W when in "power savings mode"
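Quantifying that difference, assuming the box idles like that all year:

```python
delta_w = 14.5 - 14.0                 # measured "on" minus "power savings mode"
kwh_per_year = delta_w * 8760 / 1000  # assuming it sits in that state all year
print(f"saves ~{kwh_per_year:.1f} kWh/year, about {delta_w / 14.5:.0%} of the box's draw")
```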
The screen is the primary consumer of energy. I don't know for sure, but I imagine sound is likely hungrier in that regard than any of the other discrete systems in a smart TV.
edit: additionally, TVs that don't turn off are in some cases of limited utility.
Not tellies, but an interesting look by Dave Jones (of EEVblog fame) into how/why simple devices can use up so much power when idle. A lot of it is down to lazy design of the AC-to-DC conversion circuits.
Remember, in Europe we have a lot more old stuff - especially old houses that have been family-owned for 100+ years tend not to have things replaced often unless they're unrepairable.
Even modern class D amps can draw 30+ watts when not playing anything, though.
You mean idle ("server turned on but not used") or completely off?
Servers usually have out-of-band management built in, and that's on all the time, even if the management network is not connected. It should be just a few watts, though.
“Network-connected standby devices
Modern appliances are increasingly connected to the internet during standby mode, consuming higher amounts of energy. This includes networked televisions and decoders, printers, game consoles and modems.
- Specific requirements for network-connected standby devices were introduced in 2013.
- Since January 2017 networked standby devices must not consume more than 3 to 12 Watts depending on the product.
This compares to 20 to 80 Watts previously. This decrease is expected to save an additional 36-38 TWh.”
I think this TV is from 2014, so it would have to comply with the older regulations, which I think are https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A..., but that doesn’t mention anything close to 80W.
It does mention 6W and 12W limits, but only starting in January 2015.
⇒ it seems this device doesn’t break that regulation.