
It is also illegal in the EU for new devices. https://ec.europa.eu/info/energy-climate-change-environment/...:

“Network-connected standby devices

Modern appliances are increasingly connected to the internet during standby mode, consuming higher amounts of energy. This includes networked televisions and decoders, printers, game consoles and modems.

- Specific requirements for network-connected standby devices were introduced in 2013.

- Since January 2017 networked standby devices must not consume more than 3 to 12 Watts depending on the product.

This compares to 20 to 80 Watts previously. This decrease is expected to save an additional 36-38 TWh.”
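
To put those limits in per-device terms (a rough calculation, assuming a device is left in networked standby around the clock, i.e. roughly 8760 hours a year):

    20-80 W x 8760 h ≈ 175-700 kWh per device per year (old limits)
    3-12 W  x 8760 h ≈ 26-105 kWh per device per year (post-2017 limits)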

I think this TV is from 2014, so it would have to comply with the older regulations, which I think are https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A..., but that doesn’t mention anything close to 80W.

It does mention 6W and 12W limits, but only starting in January 2015.

⇒ it seems this device doesn’t break that regulation.




I have a Sony Android TV, and with almost every setting I change (e.g. brightness, motion blur processing, etc.) I get a warning that it might “increase power consumption”. So it might just be that it is tested under the bare minimum of functionality, but as soon as you enable any of the advertised features it exceeds that.

For the record, mine mostly has 1-2W standby consumption, with occasional spikes to 10 or 20W for a few seconds. And once in a while it never enters deep sleep and gets stuck at 20W. Enabling HomeKit (Apple TV) on standby is a sure-fire way to have it sit at 20W in standby permanently; I couldn’t disable it afterwards except with a factory reset. I haven’t touched the Chromecast settings at all, so I can’t say whether it has the same effect.


There were devices that consumed 80 watts in standby. I have a cabinet full of equipment, including my cable modem and a computer (a server, no monitor) with 8 mechanical hard drives. The idle draw is 76 watts or so. How could something be so high?


Cable modems and servers weren’t designed for standby.

Also, if we’re talking consumer tech from the 1970s or 1980s, I think the reason is twofold.

Firstly, the market cared more about “time to wake up” than about power usage. People had fewer devices, they were (relatively) more expensive, and people were used to fully switching off stuff, so standby power usage wasn’t seen as a big issue.

Secondly, the tech itself didn’t exist. Nowadays, your power supply has a CPU that knows how much power your hardware needs and can optimize for it. Back then, you would be lucky if your hardware even had a switching power supply.

CPUs didn’t have power states. If you were lucky, you could halt them and have them start again on an interrupt. Hibernation didn’t exist, so in computers you’d have to keep RAM powered anyway. And because of that, OSes didn’t bother with the complex task of properly powering down peripherals when going into standby.


In the 80s, your VCR would frequently be left plugged in, and at a minimum it would keep pumping out a placeholder blue screen, particularly if connected via antenna passthrough.

However, startup time would be minimal - a couple of seconds - compared to warmup for the CRT, which would usually take at least 10 seconds before the picture looked settled. But I recall devices generally being much faster to start up. A C64 would boot to the READY prompt in 3 seconds. The penalty for turning off completely and needing to wait for boot was small, and for computing devices, wait time was dominated by loading, not booting. You might have to wait for a rewind with a VCR, and you'd certainly wait for tape loading on devices in the C64 class. Consoles with cartridges didn't take long to boot up and load, though.


What power supply has a CPU in it?


Every MacBook power supply from who knows when does.

Here's a teardown of a 2015 MacBook Pro charger that includes a Texas Instruments MSP430.

https://www.righto.com/2015/11/macbook-charger-teardown-surp...


That's a microcontroller.


> The MSP430 is a mixed-signal microcontroller family from Texas Instruments, first introduced on 14 February 1992.[1] Built around a *16-bit CPU*

From Wikipedia. Emphasis mine.


'standby' for many devices used to mean 'leave all the electronics powered up, but turn the screen/sound off'.


I had a DirecTV satellite receiver with DVR. They rolled out an update and made a big deal of their new "Energy Star ECO power savings mode."

I wondered how much power it could really save because it was a DVR with a spinning hard drive that would always record the last 30 minutes of the current channel even if you didn't set it to record anything explicitly.

So I measured it while on and also in "power savings mode" where the video output was off.

14.5W when on and 14.0W when in "power savings mode"

What a joke.


> 14.5W when on and 14.0W when in "power savings mode"

A software upgrade that saves 3% across hundreds of thousands of people doesn’t seem that pointless in the aggregate.
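
Back-of-the-envelope, assuming the box sits in that mode for roughly 20 hours a day and guessing at a million receivers in the field (both numbers are made up):

    0.5 W x 20 h/day x 365 days ≈ 3.7 kWh per box per year
    3.7 kWh x 1,000,000 boxes ≈ 3.7 GWh per year

Small per household, but not nothing in aggregate.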


The screen is the primary consumer of energy. I don’t know for sure, but I imagine sound is likely hungrier in that regard than any of the other discrete systems on a smart TV.

edit: additionally, TVs that don’t turn off are in some cases of limited utility.


Not tellies, but an interesting look by Dave Jones [of EEVblog fame] into how/why simple devices can use up so much power when idle. A lot of it is down to lazy design of the AC-to-DC conversion circuits.

https://www.youtube.com/watch?v=_kI8ySvNPdQ


It’s not lazy, it’s cost-optimized. As so often happens, market forces just push the externalities outside the price signal.


It's easy to burn energy when nobody's paying it any attention - mostly by not really idling, I suspect.


Audio amplifiers, especially old ones, can easily draw 100 watts when on 'standby' - ie. the amplifier is turned on but playing no sound.


> 'standby' - ie. the amplifier is turned on but playing no sound.

This isn't standby, and it's a good way for your vintage amp to become defective.


If it is a class A amplifier, I think that would make sense. But how many people have huge class A amplifiers for their home audio?


Old ones...

Everything from the '60s, '70s, '80s and '90s...

Remember that in Europe we have a lot more old stuff - especially in old houses that have been family-owned for 100+ years, things tend not to be replaced unless they're unrepairable.

Even modern class D amps can draw 30+ watts when not playing anything, though.


That's interesting, I've never thought of people with old houses correlating with old tech.


> If it is a class A amplifier, I think that would make sense.

The much more standard class AB still draws around 25% of its rated power with no output; 100 W at idle would correspond to a stereo 200 W amplifier, or a 5x 80 W surround amplifier.
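
Working that through, taking the 25% figure at face value and counting rated power across all channels:

    2 channels x 200 W x 0.25 = 100 W
    5 channels x 80 W x 0.25 = 100 W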


You mean idle as in "server turned on but not used", or completely off?

Servers usually have out-of-band management built in, and that's on all the time, even if the management network is not connected. It should be just a few watts, though.


Running but not really doing anything. No IO, no CPU activity, etc.


Could there be some consequence for misrepresenting the standby consumption?


The TV came on the market in 2015 AFAIK; the C denotes the 2015 model year.


If it’s like cars, they started production and sales 3/4 through 2014


Don’t think so, the thread started in April 2015 :)



