I can't buy this. That's 3,100 Wh each day, or roughly 130W of constant usage. That much power dissipated in the plastic housing would make it burning hot to the touch, or melt it.
I'd wager he's off by a couple orders of magnitude. 1.3W is much more likely for a smart plug to draw.
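The unit math behind the "orders of magnitude" claim is easy to check (this is just arithmetic on the numbers from the comment, not a measurement):

```python
# Sanity check: what constant draw does 3,100 Wh/day imply?
daily_wh = 3100
avg_watts = daily_wh / 24          # ~129 W, consistent with "roughly 130W"
assert round(avg_watts) == 129

# A more typical smart-plug draw of 1.3 W works out to:
plausible_daily_wh = 1.3 * 24      # ~31 Wh/day, two orders of magnitude less
```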
The 55W his Echo is drawing is suspect for the same reason. That's more power than almost any idling laptop.
The one thing I learned using my Kill A Watt a few years back is that "vampire draw" is likely something that only applies to VCRs from the 80s. I plugged it into a power strip full of USB and Apple chargers, and the consumption over a week was stunningly close to zero.
Every small device I measured was utterly insignificant, despite all the articles over the past decade telling you to plug all your chargers into a power strip and flip it off when not in use.
Bigger things I can understand. Our "smart TV" almost certainly draws tens of watts in spying^H^H^H^H^H^H standby mode.
Focusing on big spenders like "anything with a pump" and "anything that generates heat" would contribute significantly to reductions.
I once upgraded video cards, reducing my computer's power draw by 180W and lowering my electric bill accordingly. This was back in the early days of CFLs (they were terrible) so it may not seem like much, but it was on 24/7 and that was the power at idle. I think it worked out to something like $10/month of savings, which was quite noticeable for my total bill of $50 or so.
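The savings math works out to roughly the figure quoted, assuming a rate around 7-8 c/kWh (the rate here is a guess consistent with the stated $10/month; substitute your own):

```python
# Rough savings from a 180 W reduction running 24/7
watts_saved = 180
rate_per_kwh = 0.077                              # hypothetical ~7.7 c/kWh
kwh_per_month = watts_saved * 24 * 30 / 1000      # 129.6 kWh/month
savings_per_month = kwh_per_month * rate_per_kwh  # roughly $10/month
```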
I never measured it, but there was one area that I quickly put a Tasmota plug on. My stereo receiver under the TV.
One day I was puttering around the den and I noticed a lot of heat emanating from the credenza under the TV. It was my Sony receiver just sitting there, dutifully amplifying the null input from the turned-off TV.
So I added the plug and told Home Assistant to turn the receiver on whenever the TV turns on, and off whenever the TV turns off. My guess is that I'm saving 20-40W this way, or $2-$4 a month.
(FWIW, this is the first time I've ever been glad to have a "smart" TV. The only feature I use on that TV is the ability for Home Assistant to see it)
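A minimal sketch of that automation as Home Assistant YAML (the entity IDs here are made up; yours will differ):

```yaml
# Receiver power follows the TV's state (hypothetical entity IDs)
- alias: "Receiver on with TV"
  trigger:
    - platform: state
      entity_id: media_player.living_room_tv
      to: "on"
  action:
    - service: switch.turn_on
      target:
        entity_id: switch.receiver_plug

- alias: "Receiver off with TV"
  trigger:
    - platform: state
      entity_id: media_player.living_room_tv
      to: "off"
  action:
    - service: switch.turn_off
      target:
        entity_id: switch.receiver_plug
```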
The Hue bulbs are great, and also act as repeaters, so I have those sprinkled throughout the house. But they're expensive, so I used Sengled bulbs for most of my fixtures. The Sengleds are endpoint-only, though. So I mix those two bulbs to keep the mesh dense but also save some money.
All my door sensors are the SmartThings units. I use a mix of SmartThings and IKEA Trådfri motion sensors. One SmartThings water leak detector in the basement, a SmartThings button, and a Trådfri 5-button remote.
I started this just to have a convenient way to control lights. My house is really old, and almost all lighting is table- and floor-lamps that I would have to walk around and turn on and off. Or ceiling lamps with pull cords.
But after getting into Home Assistant, I started going a little nuts with the automations.
My favorite one: A month ago I was getting ready for bed when I realized I left my oven on all day long. For 12 hours it was keeping the oven at 375˚F. But then I realized that I already have a motion sensor in the kitchen, and if I just moved it to the door of the oven, the built-in temperature sensor could be used to remind me that the oven was left on. So now I have a rule in my automations.yaml: If the kitchen temperature exceeds the living room temperature by more than 10˚F for more than an hour, send a notification to my phone that I left the oven on.
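A sketch of such a rule in automations.yaml might look like this (the sensor entity IDs and the notify service name are hypothetical; the template trigger with `for:` is standard Home Assistant syntax):

```yaml
# Alert if the kitchen runs >10°F hotter than the living room for an hour
- alias: "Oven left on?"
  trigger:
    - platform: template
      value_template: >
        {{ states('sensor.kitchen_temperature') | float(0)
           - states('sensor.living_room_temperature') | float(0) > 10 }}
      for: "01:00:00"
  action:
    - service: notify.mobile_app_my_phone
      data:
        message: "The kitchen has been much hotter than the living room for an hour. Did you leave the oven on?"
```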
I use one to power off a bunch of devices when my computer sleeps.
My favorite story on the subject of relative waste:
I happened to have a Kill A Watt and a Wemo lying around. With no devices attached and the Wemo set to OFF, the Kill A Watt reading ranges from 0.5W to 0.7W. With the Wemo set to ON, still with no devices attached, the reading ranges from 1.2W to 1.3W, which is a noticeable jump.
Emphasis on "used to". They are not very accurate at all, and the sampling interval over which they take data on wall power draw is slow enough that you will miss important changes in current draw on your DUT. We have since switched over to expensive, calibrated Agilent power analyzers with better time resolution.
The post is well intentioned, but I challenge his data collection methods. His tools are not up to the task he has set for himself. It is not nearly as simple as the folks who sell Kill A Watts would have you believe.
They probably are; that depends on how badly they're designed. But that doesn't mean they should.
Consumer devices usually aren't as efficient as they could be with a bit more design work, which I think gives people the wrong reference point for how much power is needed to do things. As a counterexample, and a way to reset that reference point, consider e.g.:
- That there exist radio devices that are designed to run for years off a single CR2032 battery.
- That there are microcontrollers that can still execute your code while drawing nanoamps.
- My 9 m.o. kid has a plush moon with a string attached to it; when you pull it, it plays a loud melody (that part is mechanical) and flashes LEDs for ~30 seconds. Both are powered from the mechanical energy of your pull.
The way I see it, a typical device on standby and/or a typical wall wart not charging anything shouldn't pull more than some micro- or even nanowatts. So they shouldn't cost you more than a hundredth of a cent a month each. Now of course they do; there are probably some engineering constraints here (like more complicated devices wanting to keep RAM powered), and there are definitely business constraints (low-power design is more expensive). But to me, a device whose standby mode is noticeable on the power bill is simply broken.
But I DID discover that my utility, buried down in like 5 menus on some random screen of the account portal, offers a usage graph at 15 minute intervals. It's not real time - it seems to be delayed by 1-3 hours - but it is far better than getting a surprise bill. And while it used some weird SAP JSON interface, I could deduce what was what and could get the data out of it.
So I whipped up a script to basically scrape this "API" and shove the data into InfluxDB. I also added daily scrapes of the billing page and the rate page so I know when I was billed and what the current rate is. This is because my utility bills at a lower rate for the first 1,400 kWh and a higher rate for everything over that. I was not able to discern any pattern to when the bills were issued, and the utility company was very unhelpful in this regard, just "sometime every 27 to 35 days depending on holidays and weekends."
This is the result of combining all the data. While I would really like real-time data directly off the meter, even a few hours of delay is better than a random surprise $370 bill. I've written enough scrapers in my life to know it will probably break at some point, but it's been humming along nicely for the last few months.
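The two-tier billing described above is simple to model; a sketch (the rates here are hypothetical placeholders, only the 1,400 kWh tier boundary comes from the comment):

```python
# Two-tier utility billing: first 1,400 kWh at a lower rate
def tiered_cost(kwh, tier1_limit=1400, tier1_rate=0.10, tier2_rate=0.14):
    """Cost of a billing period where usage up to tier1_limit is cheaper."""
    tier1_kwh = min(kwh, tier1_limit)
    tier2_kwh = max(kwh - tier1_limit, 0)
    return tier1_kwh * tier1_rate + tier2_kwh * tier2_rate
```

Knowing when the billing period resets matters because the same kWh costs more once you cross the tier boundary within a single period.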
You could see if you can find your actual meter here: https://zigbeealliance.org/zigbee_products/?product_type=cer...
 - https://github.com/WhoSayIn/esp8266_dsmr2mqtt
We've got an EV and rooftop solar, and while I knew it was going to be an arbitrage scheme to begin with (with Net Metering, you don't use your solar to charge your car; you absolutely don't want to charge your car while the sun is shining), it was surprising to me to what extent usage had nothing to do with production. They're 2 entirely different things you treat entirely separately.
I thought I understood NEM2.0 pretty well (here in CA). Instead of the old NEM1.0, where 1kWh of generation = 1kWh of usage credit, it's bucketed by time of day. I actually thought NEM2.0 would be better for us than NEM1.0 -- we have generous peak hours (1pm-7pm) so we generate a TON of electricity off the roof at up to 38c/kWh during the day, and consume most of our power at much lower rates overnight. So 1kWh generated might be worth 2kWh into the car!
What I didn't factor in was that NEM2.0 credits are bucketed by month, and the consequences of that.
Yes, it means we generate very little in the winter while our usage is just as high as in the summer, but that would wash out over the course of a year... except for tiering! It hadn't crossed my mind how much tier 2+ usage we'd have in the winter. That means it can actually pay off to charge the car at work in the winter, while it's pointless in the summer.
We're on a grandfathered TOU plan that I wanted to keep for those generous generation credits, but I'm thinking of moving to an EV plan because the much cheaper charging rates might offset the more-generous peak hour timing and credits. (peak would be 4-9pm though, which means we generate less during peak, and we also use more at peak, since typically we get home around 7pm, all of our evening usage would still be at peak).
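The "1 kWh generated might be worth 2 kWh into the car" arbitrage above is just the ratio of the two rates; a sketch (only the 38 c/kWh peak credit comes from the comment, the overnight rate is a made-up example):

```python
# TOU arbitrage under NEM-2.0-style crediting (rates are illustrative)
peak_credit = 0.38     # $/kWh credited for peak-hour solar exports
offpeak_rate = 0.19    # hypothetical $/kWh for overnight EV charging

exported_kwh = 10
credit_dollars = exported_kwh * peak_credit    # $3.80 of bill credit
charging_kwh = credit_dollars / offpeak_rate   # covers 20 kWh overnight
# Each exported kWh pays for peak_credit / offpeak_rate = 2 kWh of charging
```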
I'd like to see smart blinds and curtains become more popular as windows can act like a giant highly efficient solar panel heating your home up when the conditions are right. Sadly again the cost of setting up and maintaining such a system could be ~$100 a year so only certain buildings would actually see a return on investment.
I put smart switches on my exterior lights to run them on a schedule. The value proposition was 'People don't arrive to darkness because we rarely remember to turn them on.' I'm trading an increase in consumption for utility.
My 'dumb' pool pump timer died and I replaced it with a more expensive smart model. Now, in addition to running the schedule, the pump also runs when the temperature approaches or drops below freezing. It may be running more or less than under the 'I manually override the timer when it's unusually cold and I remember' scenario but the value proposition was 'Prevent a pipe burst because I forgot to turn it on.'
Most of the electricity being used in the Wisconsin winter is going to be for heating. The best dollar ROI is typically weather stripping and insulation, followed by a heat pump (preferably ground-looped), and of course the best choice might just be a rocket mass heater.
For passive solar options : https://www.builditsolar.com/
I've tried dozens of times over the past few years. It never provides a 'logged in' comment box. Once I log in, if I try to log in again, it complains about a form submission error. I've never bothered to troubleshoot because mostly I want to avoid using Disqus, but a couple times a year I decide to give it a go, and.. nothing. I have or had an account, which doesn't seem to work. I've also tried to use Twitter integration, which seems the least-bad option (because my Twitter account is rarely used for anything anyway).
If you have time, maybe you'd like to go here: https://www.kajmagnus.blog/new-embedded-comments and scroll down and try posting a comment, and tell me if you run into problems? (That's a different commenting system; I'm developing it. And it'd be nice to find out if there are unknown problems preventing people like you from posting comments ... Maybe the same thing as with Disqus, and I could fix the problem in my case?)
If so, adding a ‘smart switch’ that uses 0.5W to a lighting fixture that has a 10W LED light bulb increases its average power usage by about 50%. You must be fairly sloppy to make that a gain, compared to a manually operated switch.
Also kids tend not to turn lights off.
But yeah, even if you are saving money, smart switches are pretty expensive. It will take a while to recoup the extra money and time invested.
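For what it's worth, whether a 0.5W always-on switch adds 5% or 50% to a 10W fixture depends entirely on the bulb's duty cycle; a quick sketch:

```python
# Standby overhead of a 0.5 W smart switch relative to a 10 W bulb
SWITCH_W = 0.5
BULB_W = 10

def overhead(hours_on_per_day):
    """Switch's 24/7 standby energy as a fraction of the bulb's daily energy."""
    bulb_wh = BULB_W * hours_on_per_day
    switch_wh = SWITCH_W * 24
    return switch_wh / bulb_wh

# Bulb on 24 h/day: 0.5/10 = 5% overhead
assert round(overhead(24), 2) == 0.05
# Bulb on ~2.4 h/day: the standby draw adds ~50%
assert round(overhead(2.4), 2) == 0.5
```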
I would focus on switches that control several lights at once. E.g. our garage lights use 100 watts in all. Kitchen uses 65. Hallway 50.
Teach your kids? What is this laziness of saying "kids won't do X"?
I've seen some people spend 17 years (and counting) trying to teach their kids to turn off the lights.
Much less your average 5-year-old, most of whom probably lack the mental awareness to always remember to turn off the light when they leave a room. Hell, my 5-year-old almost ran barefoot over shattered glass because she really wanted to throw something in the trash - she forgot the glass was there about 10 seconds after we told her not to come over while I cleaned it up.
Shouldn't that be 5%, not 50%?
Yes, the hub would be always on, but you get to kill every sleeping device, such as the TV, "smart" lightbulbs by making the hub control them directly, phone chargers and various transformers plugged in the outlet, idling computers, etc.
Just ask the hub to turn on your computer, for instance.
The privacy concerns are real, but I actually use smart assistants a good deal of time when I am at home.
On the other hand, after reading some articles a while ago that mentioned security issues surrounding smart bulbs (when it comes to selling them to someone after using them in your own home, due to some persisted settings), I bet your theory could be correct as well.
The Philips app is so junky I just use the hard power switches somewhat frequently.
Their lights work great with HomeKit, as do many other devices. So instead of having a separate app for Kasa switches, an app for Hue, an app for LiFX, and an app for Nanoleaf, I can control them all straight from HomeKit. Considering that this is easily doable, it seems totally possible to do it with something other than HomeKit as well.
Ehhh... I can't speak to the rest of the HomeKit integration, but the Siri stuff isn't so great either. First, it required upgrading the hub, and it still only works with Philips bulbs (even though the Hue hub can control other smart bulbs, like IKEA's). After setting it up I still get "device not responding" responses from Siri at least a fifth of the time.
Using a different app would potentially improve the problems between the hub and phone, but not between the hub and the zigbee devices.
Out of curiosity I wired in an SDM eBay-brand meter, and it was within 1% of the industrial-grade PowerTag.
Another option is emonPi, and it's easier to install than both of the above.
(Oh, and you're lucky; power in Australia is $0.28/kWh.)