
Samsung TVs appear less energy efficient in real life than in tests - davidbarker
http://theguardian.com/environment/2015/oct/01/samsung-tvs-appear-more-energy-efficient-in-tests-than-in-real-life
======
ChuckMcM
Ok, so this is taking on the feeling of an Internet Meme: "Your <noun> has
less/more <property> than the testing said it had."

Is there anyone here who doesn't believe that if you make a testable property
a price-affecting feature of a product, the manufacturer will find a way to
optimize for the property value they want, at the expense of the consumer or
the testing agency?

~~~
modoc
Interestingly, many auto manufacturers UNDERSTATE horsepower and torque
numbers. I can understand that a little, due to variance in actual engines,
but these days that variance is low, and the spread between specced power and
dyno'ed power is pretty big on many cars.

~~~
drglitch
Almost no manufacturers do it - historically, BMW and MB have been "caught"
doing this, with the 335i and C-class AMG engines.

A commonly accepted explanation is that the "cheaper" model is made to appear
less powerful on paper to entice a buyer to upgrade to the next model up
(i.e. the M3).

Relevant tests from 2010, around when this meme started:
[http://www.automobilemag.com/reviews/driven/1009_bmw_335i_33...](http://www.automobilemag.com/reviews/driven/1009_bmw_335i_335is_dyno_test/)

In reality, almost all cars "produce" far fewer ponies at the wheels than
stated on spec sheets because HP is "[S]ampled [A]t [E]ngine" - so by the time
it reaches your wheels, about 20-30% is lost to the drivetrain.

Edit: formatting

~~~
matthewmcg
You're correct, but one minor point: SAE is actually an abbreviation for
"Society of Automotive Engineers," a standards and certifying body.

The difference you are describing is between "gross" horsepower (SAE J1995
spec.), essentially horsepower measured at an unloaded crankshaft, and "net"
horsepower (SAE J1349 spec.), measured at the crankshaft with standard
intake, exhaust, and accessories installed.[1] Wheel horsepower, as read on a
chassis dyno, is lower still due to drivetrain losses.

[1] See [http://www.edmunds.com/car-technology/horsepower-gross-vs-
re...](http://www.edmunds.com/car-technology/horsepower-gross-vs-real-
world.html)
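
To put rough numbers on that 20-30% loss range (the 300 hp rating here is
just an example, not any particular car):

    # Rough wheel-horsepower estimate from a rated (crank) figure, using
    # the 20-30% drivetrain-loss range mentioned upthread.
    def wheel_hp(crank_hp: float, drivetrain_loss: float) -> float:
        return crank_hp * (1.0 - drivetrain_loss)

    for loss in (0.20, 0.25, 0.30):
        print(f"300 hp at the crank, {loss:.0%} loss: ~{wheel_hp(300, loss):.0f} whp")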

------
MikusR
And here is the response from Samsung: [http://global.samsungtomorrow.com/samsung-
firmly-rejects-the...](http://global.samsungtomorrow.com/samsung-firmly-
rejects-the-guardians-article-on-tv-compliance-testing/) and
[http://www.samsung.com/global/article/articleDetailView.do?a...](http://www.samsung.com/global/article/articleDetailView.do?atcl_id=23)
"• Motion lighting – Reduces power consumption by reducing screen brightness
when the picture on the screen is in motion"

~~~
acqq
The claim in the article is that the "motion lighting" feature (or something
Samsung attributes to it) activates just when the content is played "faster
than normal", effectively recognizing that the International Electrotechnical
Commission (IEC) test is in progress.

If Samsung really documented the exact conditions for activation, and those
conditions were repeatable and independently confirmed, then I'd believe it's
a real feature. Somehow I don't expect I'll ever read that, though.

But it's also true that modern devices don't have a constant power load, that
this is a good thing, and that regulatory bodies haven't complained about
worse offenders still in use: my old PC drew 30 watts when "powered off" (!),
and my cable box still draws the same 30 watts whether it's powered on or
off. That's a real power sucker. The TV set uses a tenth of a watt when
powered off.
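
For scale, here's what those figures add up to over a year:

    # Annual energy of a constant "phantom" load, using the figures above.
    HOURS_PER_YEAR = 24 * 365

    def annual_kwh(watts: float) -> float:
        return watts * HOURS_PER_YEAR / 1000.0

    print(f"30 W cable box:   {annual_kwh(30.0):.0f} kWh/year")  # ~263 kWh
    print(f"0.1 W TV standby: {annual_kwh(0.1):.2f} kWh/year")   # ~0.88 kWh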

------
unfocused
I did some testing back in 2009, and having just looked up my spreadsheet, it
really is hard to say definitively how much your TV will consume. At the
time, my TV was a Samsung LCD A530 (I think).

For example, usage based on input type (Component vs. HDMI):

TV (Wii as input, Component) 48 to 68W

TV (PVR Cable as input, HDMI) 84 to 104W

Then you start getting into different TV settings for games versus TV, day
versus night, and the list goes on.

So what should the TV manufacturer put? The minimum consumption, which is
probably the dimmest setting? The highest? The average? It's an interesting
question.
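
One plausible answer is a usage-weighted average. A minimal sketch, using the
midpoints of the ranges above and entirely made-up hours per day:

    # Duty-cycle-weighted average draw. Wattages are midpoints of the
    # measured ranges; the hours per day are invented for the example.
    modes = {
        "Wii (Component)": (58.0, 1.0),   # 48-68 W midpoint
        "PVR (HDMI)":      (94.0, 4.0),   # 84-104 W midpoint
        "standby":         (0.1, 19.0),
    }

    daily_wh = sum(watts * hours for watts, hours in modes.values())
    print(f"{daily_wh / 1000:.2f} kWh/day, {daily_wh / 24:.1f} W average")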

~~~
prawn
The default setting? The one shown most in the demo mode?

~~~
JoshTriplett
> The one shown most in the demo mode?

Which would lead TV manufacturers to introduce hacks that reduce power
consumption solely for the demo mode.

~~~
prawn
Wouldn't that mean that the product is showing content at its most pleasing to
customers but with reduced power consumption?

~~~
JoshTriplett
Random example from a moment's thought on how to cheat: demo mode typically
runs in a store display, and likely has little opportunity for evaluation of
audio (unless placed in a separate showroom as some stores do for a home-
theater demo, but in that case it'll likely get tuned carefully). So, in demo
mode, don't bother turning on the higher-energy portions of the audio system
at all, because nobody will notice if the audio sounds tinny. And now you've
potentially saved tens of watts in demo mode.

(Also, note that demo mode is typically tuned to be as attention-grabbing as
possible in a showroom, which often does not provide the optimal viewing
experience at home.)
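
A toy version of that cheat, with entirely made-up names and numbers, just to
make the mechanism concrete; nothing here is based on any real TV firmware:

    # Hypothetical demo-mode cheat: leave the power-hungry parts of the
    # audio path off while in store-demo mode.
    def audio_config(demo_mode: bool) -> dict:
        return {
            "subwoofer_amp_on": not demo_mode,  # tinny bass goes unnoticed in a showroom
            "main_amp_gain": 0.4 if demo_mode else 1.0,
        }

    print(audio_config(demo_mode=True))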

------
drawkbox
Testing of devices should not be done only at the outset. It should be done
over time, in the field, through independent studies.

Had that been the case, the Volkswagen issue would never have happened. It
also makes the tests much harder to game, and anything that can be gamed in
cutthroat competition will be.

Over time we would also see which products hold up through wear and tear.
Consumers should demand this type of testing now that the gaming has been
exposed.

~~~
wrsh07
Business idea: Make it easy for consumers to test their own devices / vehicles
and aggregate statistics [of users who want to upload them].

~~~
donpdonp
It's easy to start: get a Kill A Watt

[http://www.p3international.com/products/p4400.html](http://www.p3international.com/products/p4400.html)
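
A minimal sketch of the aggregation side, assuming a hypothetical CSV of
user-uploaded readings with "model" and "watts" columns:

    # Aggregate user-submitted power readings per device model.
    # The file name and column names are hypothetical.
    import csv
    from collections import defaultdict
    from statistics import mean, median

    readings = defaultdict(list)
    with open("readings.csv", newline="") as f:
        for row in csv.DictReader(f):  # expected columns: model, watts
            readings[row["model"]].append(float(row["watts"]))

    for model, watts in sorted(readings.items()):
        print(f"{model}: n={len(watts)}, mean {mean(watts):.1f} W, "
              f"median {median(watts):.1f} W")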

------
adrianN
I don't understand how the official EU test can differ so much from "real
world usage". Why don't they just record a couple of hours of real TV shows,
play them back and plug in a Watt meter?

~~~
duskwuff
> Why don't they just record a couple of hours of real TV shows, play them
> back and plug in a Watt meter?

That's essentially what they do. The allegation is that this manufacturer's
TVs recognize the specific clip used for testing and drop their brightness
while it plays.
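
Nobody outside Samsung knows whether or how such detection works; purely to
make the allegation concrete, a detector wouldn't need to be sophisticated.
A hypothetical sketch:

    # Illustration of the *allegation* only: fingerprint each frame cheaply,
    # compare recent history against a stored signature of the test clip,
    # and dim while it matches. All values are made up.
    from collections import deque

    TEST_CLIP_SIGNATURE = [142, 137, 129, 118, 121]  # hypothetical stored values

    def fingerprint(frame):
        return sum(frame) // len(frame)  # crude: mean pixel brightness

    def is_test_clip(recent, signature, tol=3):
        return len(recent) == len(signature) and all(
            abs(a - b) <= tol for a, b in zip(recent, signature))

    def backlight_levels(frames):
        recent = deque(maxlen=len(TEST_CLIP_SIGNATURE))
        for frame in frames:
            recent.append(fingerprint(frame))
            # full brightness normally, dimmed while the signature matches
            yield 0.6 if is_test_clip(recent, TEST_CLIP_SIGNATURE) else 1.0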

~~~
mahouse
That reminds me of the tricks video card manufacturers used to cheat on the
WHQL tests:
[http://blogs.msdn.com/b/oldnewthing/archive/2004/03/05/84469...](http://blogs.msdn.com/b/oldnewthing/archive/2004/03/05/84469.aspx)

Also of how Samsung disabled speed scaling on their smartphone GPUs during
benchmarks.

------
Animats
There's no mention of standby power requirements. There was a big push a few
years ago to get devices to cut down their standby power substantially. How's
that coming along?

~~~
Someone
There has been an EU directive on that since December 2008 ([http://eur-
lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:320...](http://eur-
lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32008R1275))

It basically limits standby or 'power off' power consumption to 0.50 W.

The EU isn't afraid to fine companies heavily, so chances are most equipment
follows this rule. If not, I guess a few companies are sweating now that
people are starting to question the compliance of their devices.

I also guess this directive has had implications all over the world, just
like the EU's ban on lead solder did. Once you have spent the money to design
a low-power solution, economies of scale make it not worthwhile to keep a
second design around for use outside of the EU.

I think 0.5 W is still too much, though. With 20 plugged-in devices (sounds
like a lot, but count them in your house: coffee machine, electric kettle,
blender, audio system, televisions, computer, computer monitor, external hard
disk, printer, etc.), that is still a quarter of a kWh a day.
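
The arithmetic, for reference:

    # 20 devices idling at the 0.5 W cap, around the clock.
    devices, standby_w = 20, 0.5
    print(f"{devices * standby_w * 24 / 1000:.2f} kWh/day")  # 0.24 kWh/day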

Certainly, home automation systems should aim to go way lower per power
outlet.

~~~
Karunamon
Ugh, this explains that crap pair of Creative speakers I ordered some time
back. The speakers themselves are very nice and sound good, but they have
this _infuriating_ habit of powering off if no sound is received for ~5
minutes. (And they don't automatically power back on when sound is detected.)

Now it sounds like this was done to comply with this inane EU directive.
Isn't there lower-hanging fruit to pick, like major appliances, before
hamstringing comparatively light users like consumer electronics?

~~~
Animats
Such devices need a standby power supply, one that delivers a few mA, to
drive the circuitry that wakes things up. Or a power supply that drops to an
ultra-low-power mode when its load falls. Both are available.[1][2] They add
to the parts count. The cost penalty is minor for a TV, but noticeable for
really cheap devices like speakers and wall-wart power supplies.

This is one of those problems that yields to regulatory pressure. With some
pushing, a solution like [2] below becomes standard and adds about a dime of
parts cost to everything that plugs in, probably paying for itself in the
first month of usage. Without such pressure, all the crap power supplies
don't have it.

[1]
[http://www.ti.com/lit/an/slua116/slua116.pdf](http://www.ti.com/lit/an/slua116/slua116.pdf)
[2]
[http://www.onsemi.com/pub_link/Collateral/NCP1015-D.PDF](http://www.onsemi.com/pub_link/Collateral/NCP1015-D.PDF)
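
A rough payback check with assumed numbers (the idle draws and the tariff are
guesses, not figures from the datasheets):

    # Assumed: a cheap supply idling at ~1 W vs ~0.03 W for a low-power
    # design, at an assumed tariff of $0.12/kWh.
    saved_w = 1.0 - 0.03
    monthly_kwh = saved_w * 24 * 30 / 1000
    print(f"~${monthly_kwh * 0.12:.2f} saved per month")  # roughly a dime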

------
_Ogre
This isn't very different from the benchmark-cheating shenanigans Samsung has
been busted for previously.

[https://duckduckgo.com/?q=samsung+galaxy+benchmarks+cheating](https://duckduckgo.com/?q=samsung+galaxy+benchmarks+cheating)

Slightly different medium, same idea: check for the testing criteria, vary
the output.

------
ihsw
Looks like the European Commission (EC) needs to require all these product
test results to be published online and independently verifiable.

I'm sure people across the whole political spectrum can agree that
uninhibited full disclosure would benefit us all (except lazy/malicious/
stupid manufacturers).

------
spyder
At least some Samsung TVs have an option to show information about power
consumption. I don't think it's directly measured, because it doesn't change
between light and dark scenes, but it does change when picture settings are
changed, and at least it gives some estimate: [http://support-
us.samsung.com/cyber/popup/iframe/pop_trouble...](http://support-
us.samsung.com/cyber/popup/iframe/pop_troubleshooting_fr.jsp?idx=146980&modelname=UN55B7100WF)
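
That behavior is consistent with the figure being computed from the picture
settings rather than measured. A guess at how, with invented constants:

    # Pure guesswork: a settings-keyed model would explain why the figure
    # tracks settings changes but not scene content. Constants are invented.
    BASE_W = 60.0               # hypothetical panel + electronics baseline
    PER_BACKLIGHT_STEP_W = 4.0  # hypothetical

    def displayed_watts(backlight_step: int) -> float:
        return BASE_W + PER_BACKLIGHT_STEP_W * backlight_step

    print(displayed_watts(10))  # changes with settings, never with the scene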

------
venomsnake
Coming up next - washing machines, air conditioners and fridges.

~~~
kuschku
Maybe we’ll find out that washing machines eat on average one sock more
during normal operation than during testing?

Maybe we’ll even find out that iPhones don’t actually contain an Apple?

------
mrbig4545
Is this news? I've always assumed the tests are very biased. My car claims it
can get 70 mpg extra-urban, yet it gets 54 max; I knew 70 was a lie when I
bought it. Same with the TV, light bulbs, fridge, and so on. None of it
reflects real-world efficiency.

------
NumberCruncher
"The only statistics you can trust are those you falsified yourself." -
Winston Churchill

Seems to apply to tests as well...

------
cabirum
It all began a long time ago, when 1 KB was equated to 1000 bytes.

~~~
dandrews
Yeah well, "k" means "1000", always has. The metric system got there first!
We tried to co-opt "K" for years, but we should admit the error and use KiB
when we mean 1024 bytes.
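
The two conventions side by side:

    # Decimal (SI) vs binary prefixes, and why the gap matters at scale.
    kB, KiB = 10**3, 2**10  # 1000 vs 1024 bytes
    GB, GiB = 10**9, 2**30
    print(KiB - kB)                     # 24 bytes: tiny per kilobyte...
    print(f"{500 * GB / GiB:.1f} GiB")  # ...but a "500 GB" drive is ~465.7 GiB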

------
silveira
Soon: "Software appears less efficient in real life than in tests. Test-
driven development considered harmful."

~~~
chadgeidel
It's already happened. Many here probably remember the ATI "Quake/Quack"
debacle, where the ATI driver changed the rendering pipeline based on the
binary name:
[http://www.hardocp.com/article/2001/10/23/optimizing_or_chea...](http://www.hardocp.com/article/2001/10/23/optimizing_or_cheating_radeon_8500_drivers)
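
A toy sketch of that mechanism (not ATI's actual code; the watch list is
hypothetical):

    # Toy illustration of binary-name-based cheating: quietly cap quality
    # settings when the running executable looks like a benchmark target.
    import os
    import sys

    WATCHED_BINARIES = {"quake3.exe", "quake3"}  # hypothetical watch list

    def effective_texture_quality(user_setting: int) -> int:
        binary = os.path.basename(sys.argv[0]).lower()
        if binary in WATCHED_BINARIES:
            return min(user_setting, 2)  # silently degrade for higher fps
        return user_setting

    print(effective_texture_quality(5))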

~~~
semi-extrinsic
It's no secret that GPU drivers are tuned differently for different games.
That's a feature, not a bug. You can't expect every game to conform to some
super-tight coding conventions optimized for the GPU driver, so Nvidia and
ATI optimize their drivers for the various games instead.

~~~
chadgeidel
I'm not sure if this is what you are referring to, but in the Quake/Quack
incident the texture quality and mip levels* differed _from the user-
specified values_ when the running program had a particular name. This
produced obvious visual artifacts, and I consider that cheating.

If I specify "16x anisotropic filtering" and the driver changes that to "4x"
or some other number behind the scenes, that's an unacceptable optimization.

The current "GeForce Experience" program that selects optimal settings for
the current card is an acceptable optimization, since I can see which
settings are configured and change them if I like.

[*] I don't remember the specific things they were adjusting.

------
astrodust
Looks like someone's uncorked the truth and it's spilling everywhere.

------
gluecode
Are we going to have a Volkswagen for TVs now?

