Thinking About Smart Home Power Usage (blakeniemyjski.com)
38 points by niemyjski 6 days ago | 51 comments





> The real shocker was when I decided to measure only the Wemo mini smart plug with nothing connected was costing me $0.31/day.

I can't buy this. That's 3,100 Wh each day, or roughly 130W of constant draw. That much energy dissipated in the plastic housing would leave it burning hot to the touch, or melt it outright.

I'd wager he's off by a couple orders of magnitude. 1.3W is much more likely for a smart plug to draw.

The 55W his Echo is supposedly drawing is suspect for the same reason. That's more power than almost any idling laptop draws.
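Both claims can be sanity-checked by converting the daily cost back into the constant draw it implies. A sketch, assuming the 10c/kWh rate mentioned downthread:

```python
def implied_watts(dollars_per_day: float, dollars_per_kwh: float) -> float:
    """Average constant power (W) implied by a daily cost at a given rate."""
    kwh_per_day = dollars_per_day / dollars_per_kwh
    return kwh_per_day * 1000 / 24  # kWh/day -> average watts

print(implied_watts(0.31, 0.10))    # ~129 W: implausible for an idle smart plug
print(implied_watts(0.0031, 0.10))  # ~1.3 W: a far more plausible figure
```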


He has to be off, way off. He's not using $4/mo on an Echo at 10c/kWh. We pay more than double that rate and our electric bills were frequently under $50/mo before we got an EV.

The one thing I learned using my kill-a-watt a few years back, is that "vampire draw" is likely something that only applies to VCRs from the 80s. I plugged it into a power strip full of USB and Apple chargers, and the consumption over a week is stunningly close to 0.

Every small device I measured was utterly insignificant, despite all the articles over the past decade telling you to plug all your chargers into a power strip and flip it off when not in use.

Bigger things I can understand. Our "smart TV" almost certainly draws tens of watts in spying^H^H^H^H^H^H standby mode.


To do the math more carefully: a 1W smart plug contributes about $0.09/month to the $400 electrical bill quoted. In programming terms, this is a stellar example of a micro-optimization.
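The underlying arithmetic, as a sketch (the ~12.5c/kWh rate is inferred so the $0.09/month figure works out):

```python
def monthly_cost(watts: float, dollars_per_kwh: float) -> float:
    """Monthly cost of a constant load, assuming a 30-day month."""
    hours = 24 * 30
    return watts * hours / 1000 * dollars_per_kwh

# A 1W device runs 0.72 kWh/month; at 12.5c/kWh that's about nine cents.
print(round(monthly_cost(1, 0.125), 2))  # 0.09
```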

Focusing on big spenders like "anything with a pump" and "anything that generates heat" would contribute significantly to reductions.

I once upgraded video cards, reducing my computer's power draw by 180W and lowering my electric bill accordingly. This was back in the early days of CFLs (they were terrible) so it may not seem like much, but it was on 24/7 and that was the power at idle. I think it worked out to something like $10/month of savings, which was quite noticeable for my total bill of $50 or so.


> ... and "anything that generates heat" would contribute significantly to reductions.

I never measured it, but there was one area that I quickly put a Tasmota plug on. My stereo receiver under the TV.

One day I was puttering around the den and noticed a lot of heat emanating from the credenza under the TV. It was my Sony receiver, just sitting there dutifully amplifying the null input from the turned-off TV.

So I added the plug and told Home Assistant to turn the receiver on whenever the TV turns on, and off whenever the TV turns off. My guess is that I'm saving 20-40W this way. $2-$4 a month.

(FWIW, this is the first time I've ever been glad to have a "smart" TV. The only feature I use on that TV is the ability for Home Assistant to see it)


Home Assistant is great! I flashed a cheap ESP8266 RGB LED controller with ESPurna firmware and hooked it up to HA over MQTT. What are some of your hardware recommendations? I also have a Zigbee/Z-Wave dongle for the RPi, so I'm not limited to WiFi only.

I use a Conbee II dongle with deCONZ for all my Zigbee stuff, and try to use Zigbee wherever possible.

The Hue bulbs are great, and also act as repeaters, so I have those sprinkled throughout the house. But they're expensive, so I used Sengled bulbs for most of my fixtures. The Sengleds are endpoint-only, though. So I mix those two bulbs to keep the mesh dense but also save some money.

All my door sensors are the SmartThings units. I use a mix of SmartThings and IKEA Trådfri motion sensors. One SmartThings water leak detector in the basement, a SmartThings button, and a Trådfri 5-button remote.

I started this just to have a convenient way to control lights. My house is really old, and almost all lighting is table- and floor-lamps that I would have to walk around and turn on and off. Or ceiling lamps with pull cords.

But after getting into Home Assistant, I started going a little nuts with the automations.

My favorite one: A month ago I was getting ready for bed when I realized I left my oven on all day long. For 12 hours it was keeping the oven at 375˚F. But then I realized that I already have a motion sensor in the kitchen, and if I just moved it to the door of the oven, the built-in temperature sensor could be used to remind me that the oven was left on. So now I have a rule in my automations.yaml: If the kitchen temperature exceeds the living room temperature by more than 10˚F for more than an hour, send a notification to my phone that I left the oven on.
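For reference, a minimal sketch of what such a rule might look like in automations.yaml. The entity IDs, notify service, and threshold details are placeholders, not the actual config:

```yaml
# Hypothetical sketch: notify if the kitchen runs >10°F hotter than the
# living room for over an hour (a likely sign the oven was left on).
automation:
  - alias: "Oven left on reminder"
    trigger:
      - platform: template
        value_template: >
          {{ (states('sensor.kitchen_temperature') | float(0))
             - (states('sensor.living_room_temperature') | float(0)) > 10 }}
        for: "01:00:00"
    action:
      - service: notify.mobile_app_my_phone
        data:
          message: "The kitchen has been unusually warm for an hour - oven left on?"
```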


Alternatively, you can get a $20 power bar[1] that has one outlet that determines the on/off state of the remaining plugs based on the current draw of that one outlet. No Home Assistant required.

I use one to power off a bunch of devices when my computer sleeps.

1: https://www.amazon.ca/gp/product/B01G6VTIDG/ref=ppx_yo_dt_b_...


I have an old RadioShack power strip that switches on/off according to the power draw in the first socket. Not sure, but likely someone else still makes one of these.

I use an amplifier as a small space heater for a particularly cold corner of my home :)

1.5W is cheap to operate, but my god, for the job done that seems like a relative power glutton. Entire x86 processors can operate, not idle, at 5W.

My favorite story on the subject of relative waste:

https://reductionrevolution.com.au/blogs/news-reviews/584256...


Based on the photo in the post, the Kill A Watt is calculating his costs. He doesn't show the actual wattage reading from the Kill A Watt, so it seems to me like the price is set incorrectly on the Kill A Watt.

-- Addendum

I happened to have a Kill A Watt and a Wemo lying around. With no devices attached and the Wemo set to the OFF state, the wattage reading ranges from 0.5W to 0.7W. With the Wemo set to ON, still with nothing attached, the reading ranges from 1.2W to 1.3W, which is a noticeable jump.


Yeah, this is likely caused by the poor accuracy of the current sensor at very low power; switch-mode power supplies can also mess with the way the data is sampled.

We used to use those Kill-a-Watt probes at work to classify smart home device power draw.

Emphasis on "used to". They are not very accurate at all, and the sampling interval over which they take data on wall power draw is slow enough that you will miss important changes in current draw on your DUT. We have since switched over to expensive, calibrated Agilent power analyzers with better time resolution.

The post is well intentioned, but I challenge his data collection methods. His tools are not up to the task he has set himself. It is not nearly as simple as the folks who sell Kill-a-Watts would have you believe.


An EE video[1] shows such a bad power factor on a mere smoke alarm that the VA reading is 20VA for less than 1W of actual consumption. An unsuspecting "power meter" might grossly overestimate actual dissipation.

[1] https://www.youtube.com/watch?v=_kI8ySvNPdQ
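The mismatch is easy to state: power factor is real power over apparent power, so for the smoke alarm above a VA-based reading overstates dissipation roughly twentyfold. A quick sketch using the figures from the video:

```python
def power_factor(real_watts: float, apparent_va: float) -> float:
    """Real power (W, what a home is billed for) over apparent power (VA)."""
    return real_watts / apparent_va

pf = power_factor(1.0, 20.0)  # 0.05: a terrible power factor
overstatement = 20.0 / 1.0    # a naive VA-based reading is ~20x too high
print(pf, overstatement)
```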


Do you have some of that data available? I think his data does seem off, but all these always on devices have to be costing something right? Curious what that cost actually is.

> but all these always on devices have to be costing something right?

They probably are; how much depends on how badly they're designed. But that doesn't mean they should be.

Consumer devices usually aren't as efficient as they could be with a bit more design work, which I think gives people the wrong reference point for how much power is needed to do things. As a counterexample, and a way to reset that reference, consider e.g.:

- That there exist radio devices that are designed to run for years off a single CR2032 battery.

- That there are microcontrollers that can still execute your code while drawing nanoamps.

- My 9 m.o. kid has a plush moon with a string attached to it; when you pull it, it plays a loud melody (that part is mechanical) and flashes LEDs for ~30 seconds. Both are powered from the mechanical energy of your pull.

The way I see it, a typical device on standby and/or a typical wall wart not charging anything shouldn't pull more than some micro- or even nanowatts, so they shouldn't cost you more than a hundredth of a cent a month each. Now of course they do; there are probably some engineering constraints here (like more complicated devices wanting to keep RAM powered), and there are definitely business constraints (low-power design is more expensive). But to me, a device whose standby mode is noticeable on the power bill is simply broken.


I don't - it was collected for work, and I'm not allowed to share it.

My utility also installed smart meters recently - the Focus AXRe with Gridstream RF. It theoretically has Zigbee in it. It even has the Zigbee logo on it. I haven't figured out if I can read it yet, but in fairness I also haven't put a lot of work into figuring it out either. At the very least I could not get Smartthings to see it. So if anybody knows anything more about this meter I would love to know. :)

But I DID discover that my utility, buried like five menus deep on some random screen of the account portal, offers a usage graph at 15-minute intervals. It's not real time - it seems to be delayed by 1-3 hours - but it is far better than getting a surprise bill. And while it used some weird SAP JSON interface, I could deduce what was what and get the data out of it.

So I whipped up a script to basically scrape this "API" and shove the data into InfluxDB. I also added daily scrapes of the billing page and the rate page so I know when I was billed and what the current rate is. This is because my utility bills at a lower rate for the first 1,400 kWh and a higher rate for everything over that. I was not able to discern any pattern to when the bills were issued, and the utility company was very unhelpful in this regard: just "sometime every 27 to 35 days depending on holidays and weekends."
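A minimal sketch of that scrape-and-store loop. The portal URL, JSON shape, and InfluxDB location are hypothetical placeholders; the real SAP interface and any authentication would have to be reverse-engineered per utility:

```python
import json
import urllib.request

def to_line_protocol(readings):
    """Convert [(unix_ns_timestamp, kwh), ...] into InfluxDB line protocol."""
    return "\n".join(
        f"energy_usage,source=utility kwh={kwh} {ts}" for ts, kwh in readings
    )

def scrape_and_store():
    # Hypothetical endpoint: substitute the real account-portal URL and auth.
    with urllib.request.urlopen("https://portal.example.com/usage.json") as r:
        data = json.load(r)
    readings = [(row["ts"], row["kwh"]) for row in data["intervals"]]
    # Write to a local InfluxDB 1.x via its HTTP /write endpoint.
    req = urllib.request.Request(
        "http://localhost:8086/write?db=energy",
        data=to_line_protocol(readings).encode(),
        method="POST",
    )
    urllib.request.urlopen(req)
```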

This [0] is the result of combining all the data. While I would really like realtime data directly off the meter, even being delayed a few hours is better than a random surprise $370 bill. I've written enough scrapers in my life to know it will probably break at some point, but it's been humming along nicely for the last few months.

[0] https://imgur.com/a/SwdHnCV


I couldn't find anything called a "Focus AXRe". Looking at a few different products that came up for "Gridstream RF", the compliance docs only list that they're Zigbee coordinators, not endpoint devices. Ex: https://zigbeealliance.org/zigbee_products/gridstream-rf-enh... (see 8.1 on pg 7). AFAIU, that means it expects to control other Zigbee devices, not report to coordinators like the ST hub.

You could see if you can find your actual meter here: https://zigbeealliance.org/zigbee_products/?product_type=cer...


In the Netherlands, when smart meters were introduced, one of the requirements was that the meters have a port with a defined standard (DSMR[0]) which the house occupant can use to read metrics for themselves. The standard is pretty well set up, with newer versions giving per-second power/gas usage readings in human-parsable ASCII over RS-232 on an RJ11 connector.

[0] https://www.netbeheernederland.nl/_upload/Files/Slimme_meter...


I have an implementation [0] of DSMR for the ESP8266 that I integrated with Home Assistant over MQTT.

[0] - https://github.com/WhoSayIn/esp8266_dsmr2mqtt


Seems stunning that you could charge an electric car for 8 months without realizing you're not on a TOU plan.

We've got an EV and rooftop solar, and while I knew it was going to be an arbitrage scheme to begin with (with Net Metering, you don't use your solar to charge your car; you absolutely don't want to charge while the sun is shining), it was surprising to me to what extent usage had nothing to do with production. They're two entirely different things you treat entirely separately.

I thought I understood NEM2.0 pretty well (here in CA). Instead of the old NEM1.0, where 1kWh of generation = 1kWh of usage credit, it's bucketed by time of day. I actually thought NEM2.0 would be better for us than NEM1.0 -- we have generous peak hours (1pm-7pm) so we generate a TON of electricity off the roof at up to 38c/kWh during the day, and consume most of our power at much lower rates overnight. So 1kWh generated might be worth 2kWh into the car!

What I didn't factor in was that NEM2.0 credits are bucketed by month, and the consequences of that.

Yes, it means we generate very little in the winter while our usage is just as high as in the summer, but that would wash out over the course of a year... except for tiering! It hadn't crossed my mind how much tier-2+ usage we'd have in the winter. Which means it actually can pay off to charge the car at work in the winter, while it's pointless in the summer.

We're on a grandfathered TOU plan that I wanted to keep for those generous generation credits, but I'm thinking of moving to an EV plan because the much cheaper charging rates might offset the more generous peak-hour timing and credits. (Peak would be 4-9pm, though, which means we generate less during peak and also use more at peak: since we typically get home around 7pm, all of our evening usage would still be at peak.)


The problem of saving power with small household devices is similar to the problem with micro-transactions: the overhead of connecting devices into a smart system is far greater than the recovered cost, in terms of both money and energy. A smart switch can easily run to $100 to purchase and install and $10 a year to run, so it needs to save you ~$20 a year to be worthwhile. For large, regularly used appliances such as swimming pool pumps, air conditioners, and scooter or car chargers, it can be worthwhile.
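The break-even arithmetic, spelled out (the 10-year lifetime is my assumption, chosen so the numbers match the ~$20/year figure above):

```python
def required_annual_savings(install_cost: float, annual_running_cost: float,
                            lifetime_years: float) -> float:
    """Yearly savings needed for a device to pay for itself over its lifetime."""
    return install_cost / lifetime_years + annual_running_cost

# $100 switch, $10/yr to run, assumed 10-year life -> $20/yr break-even
print(required_annual_savings(100, 10, 10))  # 20.0
```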

I'd like to see smart blinds and curtains become more popular, as windows can act like a giant, highly efficient solar panel, heating your home when the conditions are right. Sadly, again, the cost of setting up and maintaining such a system could be ~$100 a year, so only certain buildings would actually see a return on investment.


'Dumb' timers and thermostats have existed for a long time; the value of something smarter isn't necessarily about reducing power consumption.

I put smart switches on my exterior lights to run them on a schedule. The value proposition was 'People don't arrive to darkness because we rarely remember to turn them on.' I'm trading an increase in consumption for utility.

My 'dumb' pool pump timer died and I replaced it with a more expensive smart model. Now, in addition to running the schedule, the pump also runs when the temperature approaches or drops below freezing. It may be running more or less than under the 'I manually override the timer when it's unusually cold and I remember' scenario but the value proposition was 'Prevent a pipe burst because I forgot to turn it on.'


While they may not actually save money over their lifetime, it's possible they do reduce your carbon footprint. I am actually curious about that side of things. The devices obviously cost some amount of resources and money; do their electricity savings ever offset their initial creation?

https://permies.com/w/better-world : this covers this exact topic... in several different ways.

Most of the electricity being used in the Wisconsin winter is going to be for heating. The best dollar ROI is typically weather stripping and insulation, followed by a heat pump (preferably ground-looped), and of course the best choice might just be a rocket mass heater.

For passive solar options : https://www.builditsolar.com/


We recently installed two Curbs (energycurb.com) in our house, and it's not only been very educational but also made it possible to zero in on the largest consumers. Upgrading the water heater to a heat pump unit took us from barely breaking even on our PV to a daily surplus of about 3kWh. The next largest consumers are the file server running 24/7 in the basement and my work machine.

Side note: Has Disqus ever worked, for anyone?

I've tried dozens of times over the past few years. It never provides a 'logged in' comment box. Once I log in, if I try to log in again, it complains about a form submission error. I've never bothered to troubleshoot because mostly I want to avoid using Disqus, but a couple times a year I decide to give it a go, and... nothing. I have (or had) an account, which doesn't seem to work. I've also tried the Twitter integration, which seems the least bad (because my Twitter account is rarely used for anything anyway).


You can see the Disqus comments, but if you post a comment, that won't work?

If you have time, maybe you'd like to go here: https://www.kajmagnus.blog/new-embedded-comments and scroll down and try posting a comment, and tell me if you run into problems? (That's a different commenting system; I'm developing it. And it'd be nice to find out if there are unknown problems preventing people like you from posting comments ... Maybe the same thing as with Disqus, and I could fix the problem in my case?)


Sounds like an adblocker or something blocking a resource that Disqus "requires" in order to function.

I don't run 'em, but could be something built into Safari. I always figure "oh, maybe it's an early product, they'll fix it".. but disqus has been around awhile :)

I never understood the claim that smart switches on LED lightbulbs will conserve energy. The duty cycle of the average light bulb is fairly low; I would guess they are switched on for <10% of the time, on average.

If so, adding a ‘smart switch’ that uses 0.5W to a lighting fixture that has a 10W LED light bulb increases its average power usage by about 50%. You must be fairly sloppy to make that a gain, compared to a manually operated switch.
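The duty-cycle arithmetic behind that 50% figure, as a quick sketch:

```python
def average_draw(bulb_watts: float, duty_cycle: float, switch_watts: float) -> float:
    """Average power of a fixture; the smart switch draws its wattage constantly."""
    return bulb_watts * duty_cycle + switch_watts

dumb = average_draw(10, 0.10, 0.0)   # plain switch: 1.0 W average
smart = average_draw(10, 0.10, 0.5)  # smart switch: 1.5 W average, +50%
print(dumb, smart)
```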


Lots of people have gotten lazy about turning off lights since LED is so cheap. E.g. I think our kitchen light is on around 12 hours per day because the switch isn't very convenient. Or the garage light - if it's left on during the day, it probably won't be noticed until morning.

Also kids tend not to turn lights off.

But yeah, even if you are saving money, smart switches are pretty expensive. It will take a while to recoup the extra money and time invested.

I would focus on switches that control several lights at once. E.g. our garage lights use 100 watts in all. Kitchen uses 65. Hallway 50.


> Also kids tend not to turn lights off.

Teach your kids ? What is this laziness of saying "kids won't do X" ?


It's called reality. Kids are people, not machines. Parenting is more comparable to pushing on a string than writing a program.

I've seen some people spend 17 years (and counting) trying to teach their kids to turn off the lights.

Much less your average 5-year-old, most of whom probably lack the mental awareness to always remember to turn off the light when they leave a room. Hell, my 5-year-old almost ran barefoot over shattered glass because she really wanted to throw something in the trash - she forgot the glass was there about 10 seconds after we told her not to come over while I cleaned it up.


One thing I noticed since installing smart bulbs all around my house is how little light I actually need most of the time. In the evenings I usually run them at 5% brightness. Don't know how that scales consumption-wise though.

> adding a ‘smart switch’ that uses 0.5W to a lighting fixture that has a 10W LED light bulb increases its average power usage by about 50%

Shouldn't that be 5%, not 50%?


They are assuming the 10W bulb runs only 10% of the time while the smart switch runs all of the time. That puts the average use at 1W for the light and 0.5W for the smart switch.

Ah, got it, thanks!

Incidentally, I was wondering how a "smart" home could centralize power distribution to use less energy.

Yes, the hub would be always on, but you get to kill every sleeping device, such as the TV, "smart" lightbulbs by making the hub control them directly, phone chargers and various transformers plugged in the outlet, idling computers, etc.

Just ask the hub to turn on your computer, for instance.


The real usage is DNS. I set up a Pi-hole last week and... wow. 70% of requests are my fiancée's Echos (my Google devices, which I am trying to win her over to, are below the noise floor, go figure).

The privacy concerns are real, but I actually use smart assistants a good deal of time when I am at home.


In winter, assuming you have a gas-powered furnace, isn't this "wasted" electrical usage environmentally beneficial, by reducing your use of natural gas for heating?

Are you in the Milwaukee area by chance? Funny that you post this today, We Energies just called me today to notify me they were installing a new meter for our unit.

How do you turn off smart bulbs which require power to retain connectivity settings/configuration?

My Hue smart bulbs retain connectivity and configuration settings when powered off (via the light switch). If they didn't it would be awful -- you would have to reconnect and reconfigure every bulb in the house after each power outage. I assume the data is stored in some kind of flash memory.

My personal guess would be that the settings are stored either on the Hue hub or within the app you use to connect to it. Mostly because, I've noticed, after someone turns off my Hue bulbs by accident by flipping the switch and then turns them on, they don't get restored to the same scene setting as they were before. I have to manually change back to the scene it was on before being powered off.

On the other hand, after reading some articles a while ago that mentioned security issues surrounding smart bulbs (when it comes to selling them after using them in your own home, due to some persisted settings), I bet your theory could be correct as well.


That changed with updated bulb firmware. Initially bulbs would reset to their default state on power on, but they're now able to save state. The way I read it this behavior is not dependent upon the hub or app.

The Philips app is so junky I just use the hard power switches somewhat frequently.

https://huehomelighting.com/new-power-feature-to-retain-colo...


Luckily, their protocol is interoperable, so I don't think you are required to use their app, even for their initial setup (not 100% sure on the last part, but I am about 90% there).

Their lights work great with HomeKit, as well as many other devices do. So instead of having a separate app for Kasa switches, app for Hue, app for LiFX, and an app for Nanoleaf, I can control them all straight from HomeKit. Considering that this is easily doable, it seems totally possible to do this with something other than just HomeKit as well.


> Their lights work great with HomeKit, as well as many other devices do. So instead of having a separate app for Kasa switches, app for Hue, app for LiFX, and an app for Nanoleaf, I can control them all straight from HomeKit. Considering that this is easily doable, it seems totally possible to do this with something other than just HomeKit as well.

Ehhh... I can't speak to the rest of the HomeKit integration, but the Siri stuff isn't so great either. First, it required upgrading the hub, and it still only works with Philips bulbs (even though the Hue hub can control other smart bulbs, like IKEA's). After setting it up I still get "device not responding" responses from Siri at least 1/5th of the time.

Using a different app would potentially improve the problems between the hub and phone, but not between the hub and the zigbee devices.


Yeah I hacked around on the Hue REST API a few years back and this is how it works.

I had this scenario too. I purchased PowerTag by Schneider Electric and then tied it into Home Assistant via Node-RED. This gave me usage on individual circuits, and automation resulted in a 40% reduction in energy usage.

For curiosity I wired in an SDM eBay-brand meter, and it was within 1% of the industrial-grade PowerTag.

Another option is emonpi and it’s easier to install than both of the above.

Happy metering! (Oh, and you're lucky - power in Australia is $0.28/kWh.)



