Is there anyone here who doesn't believe that if you make a testable property a price-affecting feature of a product, the manufacturer will find a way to optimize for the property value they want, at the expense of the consumer or the testing agency?
> When a measure becomes a target, it ceases to be a good measure.
For a TV, meh. It's a little more energy used, and the extra cost is barely noticeable to the owner. When you're done with it, it's essentially disposable.
For the cars though... oh wow. It hits taxation levels in some countries, and fuel cost everywhere. It hits resale value, which likely hits financing companies. It actually causes additional pollution in crowded cities. It crosses thresholds in which large swathes of the market would choose not to purchase the vehicle based on those numbers (business leases).
Everyone knows that advertised numbers are approximations based on ideal conditions on a very small sample. The question is really how misleading those approximations are, and what the impact is on the consumer and any third parties.
There are so many TVs and they run so many hours a day the aggregate energy use is non-negligible.
Consumers should avoid products from firms that game tests such as these. Toyota, for example, was singled out in the fuel consumption report as a company that did not cheat on the tests.
An analogy is a student studying for the exam. That's gaming it somewhat too. The exam is supposed to test what they learnt during the semester and retained over a long period, not what they crammed the night before and will forget by next week.
A commonly accepted explanation for this is that the "cheaper" model is made to appear less powerful on paper to entice a buyer to upgrade to the next model up (i.e. the M3).
Relevant tests from 2010 ~ when this meme started: http://www.automobilemag.com/reviews/driven/1009_bmw_335i_33...
In reality, almost all cars put far fewer ponies to the wheels than the spec sheet states, because the quoted HP is an SAE rating measured at the engine's crankshaft - so by the time it reaches your wheels, about 20-30% has been lost to the drivetrain.
The difference you are describing is between "gross" horsepower (SAE J1995 spec.), essentially horsepower measured at an unloaded crankshaft with no production accessories, and "net" horsepower (SAE J1349 spec.), measured at the crankshaft with the production accessories and exhaust fitted; the further drop to wheel horsepower comes from drivetrain losses.
 See http://www.edmunds.com/car-technology/horsepower-gross-vs-re...
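Rough numbers for that claim, as a toy sketch: the 20% loss figure below is just an assumption picked from the 20-30% range mentioned above, and the real loss varies a lot with the drivetrain (RWD manuals tend to lose less, AWD and automatics more).

```python
# Back-of-the-envelope: quoted (crank) horsepower vs. what reaches the wheels.
# The 20% loss is an assumed figure, not a measurement.
crank_hp = 300          # manufacturer's quoted crank figure
drivetrain_loss = 0.20  # assumed fraction lost between crank and wheels

wheel_hp = crank_hp * (1 - drivetrain_loss)
print(f"{crank_hp} hp at the crank ~ {wheel_hp:.0f} hp at the wheels")
# => 300 hp at the crank ~ 240 hp at the wheels
```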
There's also the common saying that "German horses are stronger", because acceleration and dyno tests (long before the turbo 335) have shown them to be faster than they "should" be, particularly compared to certain American muscle cars. I'm not sure how much reality there is behind this, though.
The Japanese "gentleman's agreement" was a limit on power of 207 kW. Towards the end, many of the cars hit by this were making more than that at the wheels, let alone the crank. It was possible to gain 50 kW on certain WRXes and Evos with a different air intake and exhaust, they were so restricted.
Also, for a while there was an "agreement" that sports bikes wouldn't go faster than 300 km/h - they didn't actually limit the bikes, the speedo just ended at 300 km/h.
Lots of R1s and Hayabusas will be revving somewhere in the middle of their rev range at 300 km/h in top gear; you can then wind on another 4-5-6 thousand rpm while the speedo stays pegged at 300 km/h. You are, of course, going faster than that.
Obviously you don't double the revs, but they certainly increase.
If Samsung really documented the exact conditions for the activation, and those conditions were repeatable and independently confirmed, then I'd believe it's a real feature. Somehow I don't expect that's what I'll read, though.
But it's also true that modern devices don't have a constant power load, which is a good thing, and that regulatory bodies didn't complain even about things still in use: my old PC used 30 watts when "powered off" (!), and my cable box still does the same, a 30-watt load whether powered on or off. That's a real sucker. The TV set uses a tenth of a watt when powered off.
For example, usage based on input type (Component vs. HDMI):
TV (Wii as input, Component) 48 to 68W
TV (PVR Cable as input, HDMI) 84 to 104W
Then you start getting into different TV settings for games versus TV, day versus night, and the list goes on.
So what should the TV manufacturer put? Should they put the minimum consumption, which is probably the dimmest setting? Or the highest? Or the average? It is an interesting question.
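One way to see the problem: a duty-cycle-weighted average depends entirely on the usage profile you assume. In the sketch below, the wattages are the midpoints of the two ranges quoted above, but the hours per day are invented assumptions; change them and "the" number changes too.

```python
# Toy illustration of why a single power figure is ambiguous.
# Wattages are midpoints of the ranges above; hours are assumed.
profiles = {
    "gaming (Wii, Component)": (58, 1.0),   # watts, hours per day
    "TV (PVR, HDMI)":          (94, 4.0),
    "standby":                 (0.5, 19.0),
}

daily_wh = sum(watts * hours for watts, hours in profiles.values())
weighted_avg_w = daily_wh / 24

print(f"Daily energy: {daily_wh / 1000:.2f} kWh, average draw: {weighted_avg_w:.1f} W")
# => Daily energy: 0.44 kWh, average draw: 18.5 W
# A different assumed profile gives a very different "average" figure.
```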
Which would lead TV manufacturers to introduce hacks that reduce power consumption solely for the demo mode.
(Also, note that demo mode typically tunes for most attention-grabbing in a showroom, which often does not provide the optimal viewing experience at home.)
The Volkswagen issue would never have happened. It also makes it much harder to game; anything that can be gamed in cutthroat competition will be.
Over time we could also see which products hold up to wear and tear. Consumers should demand this type of testing now that the gaming has been exposed.
That's essentially what they do. The allegation is that this manufacturer's TVs recognize the specific clip used for testing and drop their brightness when they recognize it.
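Nobody outside the company knows how that recognition would work; purely as a speculative illustration of the allegation, clip fingerprinting could look something like the sketch below. Every name, threshold, and luminance value here is invented for the example; none of it comes from any real firmware.

```python
# Hypothetical sketch of the *alleged* mechanism: match the luminance trace of
# recent frames against a stored fingerprint of the test clip, and dim only then.
TEST_CLIP_FINGERPRINT = [0.42, 0.45, 0.41, 0.47, 0.44]  # invented per-frame means
MATCH_TOLERANCE = 0.02        # how close the trace must be to count as a match
NORMAL_BACKLIGHT = 1.0        # full brightness
TEST_MODE_BACKLIGHT = 0.6     # dimmed level used only when the clip is "recognized"


def mean_luminance(frame):
    """Average value of a frame (frame = flat list of 0..1 luma samples)."""
    return sum(frame) / len(frame)


def looks_like_test_clip(recent_frames):
    """True if the recent luminance trace matches the stored fingerprint."""
    if len(recent_frames) < len(TEST_CLIP_FINGERPRINT):
        return False
    trace = [mean_luminance(f) for f in recent_frames[-len(TEST_CLIP_FINGERPRINT):]]
    return all(abs(a - b) <= MATCH_TOLERANCE
               for a, b in zip(trace, TEST_CLIP_FINGERPRINT))


def choose_backlight(recent_frames):
    """Drop brightness only while the (alleged) test content is on screen."""
    return TEST_MODE_BACKLIGHT if looks_like_test_clip(recent_frames) else NORMAL_BACKLIGHT
```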
Also reminiscent of how Samsung disabled speed scaling on their smartphone GPUs during benchmarks.
From their short description of "motion lighting", it isn't obvious to me why this is a feature in the first place.
In general, all of these tests can be "cheated" because there is no way around it: you can measure the TV's usage while playing content from the built-in antenna at 50% brightness and 10% audio, or you can test it while streaming 4K video over wifi, recording a show from your set-top box via HDMI-in to USB-attached storage, at full brightness, in 3D 120 Hz mode.
The difference in power usage between those two cases will be quite insane.
The problem begins with the whole grading system: people want to see the green A on anything they buy because they assume it means higher quality. That, plus moronic testing and standards, is what will push any manufacturer to cheat under any circumstances.
But when you brand TVs, ACs, fridges, washing machines, heaters, etc. with exactly the same A++ to G scale, where even A itself isn't green, don't expect manufacturers not to cheat. It's quite a big hit to get a yellow/orange/red marker on your product, and it's not some minor detail hidden in small text; it's marked very clearly, usually as big as or bigger than the image of the product on the details page/sticker next to it on the shelf.
The powered-on consumption of TVs is pretty much meaningless and shouldn't be tested at all, just the standby power, which in some cases is quite insane even for A-rated TVs. A TV isn't a multi-kW device like air conditioning or heating, which do have quite a bit of impact.
Oh, and just to make it clearer: they don't actually measure the standby power consumption of TVs at all for this rating, even though that's probably far more important. My Sony Bravia, rated A+, barely powers down in standby compared to its lowest-brightness setting, because even in standby it's connected to the network and does some voodoo when it should be well off. And, like probably every other TV out there, it's powered on only 10% of the time at most.
I almost always assume the opposite. I assume the product was designed in one of two ways:
1. Designed fully-featured and then hamstrung to meet arbitrary energy goals, or
2. Designed with energy goals as the first priority
In either case, other useful qualities (performance, longevity, etc.) are second-class.
I think my mindset started when low-flow toilets were introduced; those things could barely flush an empty bowl of water. To me, energy-efficient means crap performance.
I had a Prius for a while and it was clear that a lot of things were sacrificed so they could spend the budget on energy efficiency. Compare the interior or ride quality of a Prius with anything else in its price range and you'll see what I mean.
It's just a design problem to work around.
(Yes, the Prius interior isn't well finished for its price, but they managed to make it roomy inside and small outside even under aerodynamic constraints.)
I bought a C-rated fridge because I'd read the testing procedure. It involves quite a bit of stuff: how long it takes to cool back down after the door is opened for 2 minutes, how much power it needs to cool down some volume of water, how long it takes to freeze X amount of whatever beef, etc.
And I really don't care about that. I live with my GF, no kids, no one else; the fridge is opened maybe 5 times a day for no longer than 10 seconds. Dinner is cooked daily with "fresh" ingredients from the store, or we get takeaway/delivery.
So the fridge is used to cool fruit/soda/water/milk and some toppings for snacks.
The freezer holds some emergency ground beef for the "shit, it's too late to order" pasta, some hot dogs for when you really get lazy, some bread, and vodka.
If I had 3 kids and stuck a half-eaten turkey into the fridge 3 times a week, I would probably care more about how much power it yanks to cool it down. But with the use I have, I don't need to care: while it's less efficient at cooling, it's just as efficient at doing nothing for most of the day, because the insulation is still the same.
But the majority of the public doesn't do that. Almost no one does research before buying something; at best they might read some reviews on CNET, but more likely they go to Currys or Fry's or Best Buy and buy something they think is good. And since everyone has been conditioned to see green as good and to think a higher grade is better, even with a mostly meaningless grading system people will still buy whatever has the huge A+++ green sticker rather than an orange/red B or C.
I'm pretty satisfied with my Prius, personally. My mother's 2010 Chevy Impala may be a little roomier, but not enough for me to care.
The Sony TV I have has a stated power-on consumption of 227 watts (an A/A+ rating is very easy for big TVs, because the bigger the screen and the more features it has, the bigger the reference power consumption it's measured against). When you turn everything on it's actually higher by about 20-30 W, but its standby power is pretty much the same, because its tuner is on, its wifi is on, all its smart hub features are on, etc.
The second TV I have is a 42" LG Smart TV, also rated A+, but its standby power consumption is nothing: on, it draws 50 W; off, 0.3 W.
So the Sony costs me a relative fortune a year to run, while the LG costs me pretty much as much as if it were sitting in its box in the closet.
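Taking those figures at face value, the gap is easy to put in euros. In the sketch below, the idle hours per day and the electricity price are my own assumptions, and the Sony number simply takes "standby is pretty much the same as on" literally:

```python
# Rough annual standby cost. Idle hours and price per kWh are assumptions.
IDLE_HOURS_PER_DAY = 21
PRICE_PER_KWH = 0.20  # EUR, assumed

def yearly_standby_cost(standby_watts):
    kwh = standby_watts * IDLE_HOURS_PER_DAY * 365 / 1000
    return kwh, kwh * PRICE_PER_KWH

sony = yearly_standby_cost(227)   # "standby is pretty much the same as on"
lg = yearly_standby_cost(0.3)

print(f"Sony: {sony[0]:.0f} kWh/year ~ EUR {sony[1]:.0f}")   # ~1740 kWh, ~EUR 348
print(f"LG:   {lg[0]:.1f} kWh/year ~ EUR {lg[1]:.2f}")       # ~2.3 kWh, ~EUR 0.46
```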
It basically limits standby or 'power off' power consumption to 0,50 W.
The EU isn't afraid to fine companies heavily, so chances are most equipment follows this rule. If not, I guess a few companies are sweating now that people start to question compliance of devices.
I also guess this directive has had implications all over the world, just like the EU's ban on lead solder did. Once you have spent the money to design a low-power solution, economies of scale don't make it worthwhile to keep a second design around for use outside of the EU.
I think 0,5 W is still too much, though. With 20 plugged-in devices (sounds like a lot, but count them in your house: coffee machine, kettle, blender, audio system, televisions, computer, computer monitor, external hard disk, printer, etc.), that is still a quarter of a kWh a day.
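The quarter-of-a-kWh figure worked out (straight arithmetic from the numbers above, with the per-year total added for scale):

```python
# 20 devices idling at the 0,5 W cap, around the clock.
devices = 20
standby_w = 0.5   # EU limit per device
hours = 24

daily_kwh = devices * standby_w * hours / 1000
yearly_kwh = daily_kwh * 365

print(f"{daily_kwh:.2f} kWh/day, {yearly_kwh:.0f} kWh/year")
# => 0.24 kWh/day, 88 kWh/year
```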
Certainly, home automation systems should aim to go way lower per power outlet.
Now it sounds like this was done to comply with this inane EU directive. Isn't there lower-hanging fruit to pick, like major appliances, before hamstringing comparatively light users like consumer electronics?
This is one of those problems that yields to regulatory pressure. With some pushing, a solution like the one below becomes standard, and adds about a dime of parts cost to everything that plugs in, probably paying for itself in the first month of usage. Without such pressure, all the crap power supplies don't have it.
Also, one could argue that this is more important for the smaller items. Most people have way more of them, and a 5 W power drain, 24 hours a day, on a €30,— toaster that you use for, rounding up, 1 hour a week is relatively a much larger waste, and feels way worse, than the same drain on a €1000,— television that you use, rounding down, 2 hours every day.
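Put in numbers: the drain and the usage hours come straight from the argument above, and the rest is plain arithmetic. The absolute waste is identical for both devices; what differs is how it compares to the hours of actual use.

```python
# Same 5 W drain on both devices; very different waste per hour of actual use.
idle_kwh_per_year = 5 * 24 * 365 / 1000   # 43.8 kWh/year, identical for both

toaster_use_hours = 1 * 52                # ~1 hour a week
tv_use_hours = 2 * 365                    # ~2 hours a day

print(f"Toaster: {idle_kwh_per_year / toaster_use_hours:.2f} kWh wasted per hour of use")  # ~0.84
print(f"TV:      {idle_kwh_per_year / tv_use_hours:.2f} kWh wasted per hour of use")       # ~0.06
```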
End result (this happened to a friend): kitty knocks over a power strip, an adapter breaks internally from the impact and causes a fire. Thankfully he had a smoke detector.
Slightly different media, same idea. Check for testing criteria, vary the output.
I'm sure members of the whole political spectrum can agree that uninhibited full disclosure would benefit us all (except lazy/malicious/stupid manufacturers).
Maybe we’ll even find out that iPhones don’t even contain an Apple?
Seems to apply also to tests...
If I specify "16x Anisotropic filtering" and the driver changes that to "4x" or some other number behind the scenes, that's an unacceptable optimization.
The current "GeForce Experience" program that selects optimal settings for the current card is an acceptable optimization, I can see which settings are configured and change them if I like.
[*] I don't remember the specific things they were adjusting.