Pre-built solar homes could make renewable energy almost 50% cheaper (inverse.com)
196 points by elorant 17 days ago | 142 comments

I'm totally a layperson on the subject, but the other day I thought about how far more devices in my house use DC than they did 30 years ago. It seems odd that I convert solar to AC and then back to DC.

I wonder if there's any future to running both an AC and a DC bus through the house, or if that makes no sense for reasons beyond my basic understanding.

A friend of mine did something similar. He put in a small solar array with only 3 panels, an inverter/charger, and 400 Ah of 12-volt batteries. This DC then feeds into a breaker panel and goes to lights in all the rooms. He uses LED spotlights, the little bulbs that plug into wires stretched across the ceiling. He said he just ran 10-gauge Romex wires through the attic, down to DC-rated switches and then to the ends of the cables.

He still has all the original 110 wiring and fixtures in the house, but this battery-backed solar system runs all his lights. It can easily go 2 days without sun before he needs to use the regular lights. Another benefit is that DC LED bulbs are by definition flicker-free, and also low consumption (4-6 watts each) but bright. He calculated he can light each room better than before with 6-8 bulbs, so 40-50 watts, i.e. only about 3-4 amps (@ 12 VDC) to each room.

All the big appliances in the house still run off of AC, but he does have a 110 inverter so that he could run a critical appliance such as a fridge during an extended outage.

So this little system replaces 1-2 kWh of incandescent lighting per day (CFLs or LEDs on 110 would be less, but not flicker-free), while providing lights during outages. He says it's not cost-effective (the usage displaced by solar and battery doesn't cover the cost of the system), but the backup power is worth it to him.
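A quick sanity check of the numbers quoted above (all figures are the ones from the comment; I'm assuming the full 400 Ah is usable, which is optimistic for real battery banks):

```python
# Sanity-check of the quoted figures. Assumes the whole battery
# capacity is usable, which real batteries don't quite allow.
bulb_watts = 6          # per LED spotlight (quoted 4-6 W)
bulbs_per_room = 8      # quoted 6-8 per room
volts = 12.0

room_watts = bulb_watts * bulbs_per_room   # 48 W per room
room_amps = room_watts / volts             # current per room at 12 VDC

battery_kwh = 400 * volts / 1000           # 400 Ah at 12 V = 4.8 kWh
daily_lighting_kwh = 2                     # upper end of quoted usage
days_of_backup = battery_kwh / daily_lighting_kwh

print(f"{room_amps:.1f} A per room, {battery_kwh:.1f} kWh stored, "
      f"~{days_of_backup:.1f} days of lighting")
```

The result (~4 A per room, a bit over 2 days of backup) lines up with the "easily go 2 days without sun" claim.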

I am actually thinking of something similar. Instead of 12v DC I probably would go for 48v or maybe even higher. With 12v your cables soon start to become too thick to be practical if you want to run serious things like garage doors / window shades etc.

My friend said he picked 12V because that's what the easily-available LED bulbs are rated at, and he just wanted a simple lights-only system.

I guess you could pick whatever voltage you want for your loads and cable runs, but you need to have the appliances you want available (or easily DIY) in that voltage. It would probably be possible to convert down the voltage once, but any more than that and you're introducing loss again.

120VAC LED lamps will just rectify that 120V to DC and then use a power supply to drop it to around 30V.

In other words, a 120VAC LED should work just fine on 48VDC with the same output.

It also explains why they don’t dim.

(Obviously avoid the dimmable ones in this application!)

On the other hand, you could connect 12V lights in series to get to the desired voltage.

That’s not always a good idea.

It is putting more current through each light.

And particularly problematic if each bulb has parallel LEDs and they start burning out: then more and more current is put through the remaining LEDs.
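As a rough illustration of that failure cascade, assume a driver that pushes a roughly constant total current through a bank of parallel LEDs (the figures here are made up for the example):

```python
# Hypothetical constant-current driver feeding parallel LEDs.
# Each LED that fails open leaves the same total current split
# among fewer survivors, stressing them further.
total_ma = 300.0  # assumed driver output

for leds_alive in (6, 5, 4, 3):
    per_led_ma = total_ma / leds_alive
    print(f"{leds_alive} LEDs alive: {per_led_ma:.0f} mA each")
```

Each failure raises the current through the survivors, which accelerates the next failure.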

> Another benefit is that DC LED bulbs are by definition flicker-free, and also low consumption (4-6 watts each) but bright.

You still need PWM to control brightness, no? If you're using resistors, efficiency goes down a lot.

Sure, but if starting with DC you can pulse on the order of hundreds if not thousands of Hz instead of 50/60Hz, which shouldn’t be noticeable.

Our Philips brand 120V AC LED lighting runs about 10 watts each and is flicker-free. It's not 6W, but I think that is largely due to the higher light output at max brightness (~800 lumens). It also has the advantage of using existing wiring. Integration with the voice assistant software and color modes has been a nice added benefit (ignoring privacy stuff).

12 volt wiring in homes isn't terribly efficient as once you get past 6-10 feet you need really expensive (heavy, thick) wires to not have much voltage drop. Most new boats use 24 or 48v for long haul and then local step down to 12v where needed (kitchen, living, sleeping areas etc).

You can find 120V LED lighting that is flicker-free, eg. anything sold under the Philips brand (in my experience).

Power losses in wiring are an issue. Many of your DC devices also happen to be low-voltage. (For example, USB devices.)

If you transmit the same number of watts at a higher voltage (lower current) rather than at a lower voltage (higher current) through a wire, you will lose less power.

So for moving the power around the house at distances more than around 10 feet (~3 meters), it's more energy-efficient to convert it near the point of use. And if you're already going to convert it, it's probably not worth the expense and complication of running two separate wiring systems.
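The effect is easy to put numbers on: resistive loss is I²R, and for the same power the current halves each time the voltage doubles. A sketch, where the wire resistance is an assumed example figure (roughly 10 m of mid-gauge copper, out and back), not a spec:

```python
# I^2 * R line loss for the same delivered power at different bus
# voltages. R_WIRE is an assumed round-trip wire resistance for
# illustration only.
def line_loss_watts(power_w, volts, wire_ohms):
    current_a = power_w / volts          # higher voltage -> less current
    return current_a ** 2 * wire_ohms    # loss scales with current squared

R_WIRE = 0.17   # ohms, assumed
POWER = 100.0   # watts delivered to the load

for v in (12, 48, 120):
    print(f"{v:>3} V bus: {line_loss_watts(POWER, v, R_WIRE):.2f} W "
          f"lost in the wiring")
```

Going from 12 V to 48 V cuts the wiring loss by a factor of 16, i.e. (48/12)², which is why boats and telecom plants favor 48 V.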

The USB power supply is probably a 5-volt switchmode power supply which doesn't care whether its input is 80 volts or 300 volts, or ac or dc. You could very reasonably run 100 volts dc or 200 volts dc to things that use switchmode power supplies like that.

A normal worldwide USB power supply accepts 86-240V AC, so you can usually supply a DC voltage on the same terminals with no issue, as long as it's comparable to the rectified voltage (roughly the AC peak, i.e. RMS × √2).

It has less to do with ac vs dc and more to do with the voltage and distance from source to load. If you wanted a dc bus inside a house you would need to have voltages in the same range as ac power 120/240V in North America. At this point you have the same challenges of how do you reduce the voltage to something low and easy to work with for an end device like a light bulb or cell phone. Stepping down voltage is easy to do with AC power but difficult for DC which is why I don’t think it will ever be practical.

48v dc bus and buck converters work just fine. generally power supplies that work off AC rectify and go through the same business - except they need larger inductors to sort out the zero crossing.

i think the difference you're looking for is in volume of copper.

Maybe we should instead invent a domestic 3 phase distribution system. No more zero crossings requiring energy storage, just flat constant power. It would require a new, small four-conductor plug to be designed. Total volume of copper would be less for the same available power, since three wires deliver twice the power of two single-phase wires of the same conductor size. There would be more insulation involved, so the wiring might grow, or might shrink if outlets were held to 1500 watts.
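The "flat constant power" claim checks out numerically: with three equal sinusoidal phases 120° apart into a resistive (unity-power-factor) load, the instantaneous powers always sum to 3·Vpk·Ipk/2. A quick check with arbitrary example values:

```python
import math

# Sum of instantaneous power across three phases spaced 120 degrees
# apart. With equal amplitudes and unity power factor the total is
# the constant 3 * Vpk * Ipk / 2 -- no zero crossings to ride through.
VPK, IPK = 325.0, 10.0  # example peak volts/amps, not a standard

def total_power(t, hz=50.0):
    w = 2 * math.pi * hz
    return sum(
        VPK * math.sin(w * t - k * 2 * math.pi / 3) *
        IPK * math.sin(w * t - k * 2 * math.pi / 3)
        for k in range(3)
    )

samples = [total_power(t / 1000.0) for t in range(20)]
print(min(samples), max(samples))  # both equal 3 * VPK * IPK / 2
```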

The big savings would be in longer-lived electronic devices. A sizable portion of digital electronics ultimately fail because the capacitors in their power supplies exceed their designed lifetime. Electrolytic capacitors (ticking time bombs) are cheap and compact for the capacity, so they are used for the energy storage required in single-phase systems; with a three-phase power supply, less storage is required, and maybe manufacturers wouldn't have to reach for electrolytics at all. I've had a 4K TV and a rather expensive musical instrument amp fail from capacitors this year, about $3000 of gear. Looking back further there are a couple of professional powered sound reinforcement speakers (twice), and an iMac G5 (three times). It's probably safe to say over the decades I've had $10k of electronics fail prematurely from electrolytic capacitor failure. These are all items which were not obsolete or worn out. Scale that up from one person to worldwide and it's easily in the hundreds of billions of dollars which could be saved.

The problem is you can't get there from where we are. No one wants to drag more wires to residential houses, no one wants to shell out >$1k for a household phase converter and then rewire their house. You won't convince people to add three phase to a new house because it might be the future. And you won't convince the semiconductor houses to make a line of three phase power supply controller chips just because, well, maybe someone might use it.

We're probably trapped at a suboptimal local maximum.

Edit: In response to atoav's comment I found IEC 60309 and its extra low voltage variant, which is a 42mm (1 2/3") 3 phase plug and connector in two compatible sizes good for 1.3kW and 2.6kW. Also, the Germans and Swedes did the engineering first and then built an electrical system, and they get 400V 3 phase at 16A to their cooktops and stoves through a plug, for a whopping 11kW!

I am not exactly sure what you imagine, but I (in Germany) have two 3 phase outlets (3x 16A @230V) in my flat and this is very common here. We use it for cook tops and ovens in the kitchen typically, and it makes the difference between 15 mins for boiling water (old single phase cook top) and (on my induction cook top) 1 min.

Edit: and as an aside, why we probably don't do that everywhere: it makes for bigger plugs, and keeping safety standards can be a bit more expensive and space intensive with the potential 400V between phases.

I think Germany is one of the few countries which bothers to wire up three phase power to ordinary houses, not sure why.

It's quite common in the Nordic countries as well.

Well that's news to me! I'm jealous. I'll bet you can preheat an electric oven in less than 20 minutes.

I see the successor to the 16A Perilex connectors you are referencing is IEC 60309 which has an "extra low voltage" version with a smaller, 42mm diameter (1 2/3") plug good for 16A or 32A. 16A devices will fit in a 32A socket, but not the other way around. So that seems to be a 1.3kW and 2.6kW solution. Sign me up!

"So that seems to be a 1.3kW and 2.6kW solution"

Germany is 230v so that would be more like 3kw and 6kw.

I'm in the UK (also 230v) and a standard outlet can do 3kw, I think fuseboard fuses are 32amp so I don't know if 6kw is a practical upper bound though (edit: On 2nd thoughts I'm not even sure this is relevant to 3 phase).

Pedantic: Standard outlets are 16A in mainland Europe, so 3.7kW.

Modern single ovens can also be connected to a standard single phase outlet rather than a dedicated circuit. In my kitchen the induction cooktop is on a 16A three phase circuit, and all the other outlets (fridge/freezer, oven, dishwasher and appliances) are on a 16A single phase circuit.

5:11 from 20°C to 220°C

Wouldn't pedantic be 3.68kw?

I've never actually come across anything higher than 3kW. Also, a 16 amp fuse isn't that precise; you could probably get up to 20 amps without it failing. The hotter you run it the quicker it fails though, so if you're regularly running 3.7kW through a 16 amp fuse, you may well find it failing, not through an underlying fault of anything plugged into it.

>I think fuseboard fuses are 32amp

Most cables are 16A, so the outlets are 16A as well. Also UK uses 240V, not 230V. 32A circuit breakers would make the cables burn when overloaded.

Flex cables might be 16amp, individual sockets are 16 amp though, so if you have more than one of those on a ring main...

"Use a 30amp fuse or 32amp MCB for larger radial circuits, ring circuits up to 100m, cookers and electric showers up to 7200W"


normal fuses (in the fusebox) still have to be 16A (B or C class), the main fuse can be anything 32-50A or even tri-phase 50A.

>ring circuits up to 100m, cookers and electric showers up to 7200W

Normally those tend to be 3 phase.

Of course it's possible to get over 16A on a single phase and a circuit breaker but that's quite a non-standard option. The issue is that if the fuse/circuit breaker is rated for 32A in a standard application (i.e. an outlet) the cables are likely to burn 1st.

I think we might be getting our wires crossed :)

I'm talking about the UK here, which I understood to be similar to the rest of Europe(?).

That's a UK website and 3 phase isn't common here. I've just double checked my fuse box, and I have a 32 amp fuse for my socket ring main.

Don't forget for a ring main the current is spread over 2 wires, so if there's 32 amps of power draw, there's only 16 amps on each side of the ring.

UK here:

We don't have three phase (well, it's not common) but we can heat an oven up to 220°C (428°F) in 7 minutes. I think it's a 3kW oven, on a 32 amp (isolated) breaker.

I've never ever seen it in the UK, but I did live in a house with a flat on a separate meter that, as it turned out, was on a separate phase. I only discovered this because there was a fault on the line for only one phase, and only after checking and changing every fuse (wire), and unplugging every electrical item. Grr.

Had something to toast so I tested mine too (3x 16A @230V). Took 4:52 to heat up to 220°C

I'm in Romania and it heats to 180C in about 3 and a half minutes.

Do we need to invent anything? My understanding is that most of the grid in the US operates on 3 phase, and some buildings (including residential) already have a 3 phase supply to operate AC and industrial applications.

All we are missing is for it to be standard to wire the 3 phase distribution throughout buildings to every outlet (preferably in a backwards-compatible manner; in theory converting 3-phase to single phase is just a matter of ignoring a wire). I would assume that a) the grid can handle people doing this, and b) doing this with our current 3-phase standard would result in our current single-phase standard.

I hate those 3 phase residential buildings, because they usually deliver 208V to ovens and dryers instead of 240V.

While the appliances are probably approved for 208V, I doubt they’re drawing more peak current to make up for it.

Wouldn't it be easier to mandate separate, replaceable power supplies vs changing the electricity standard in buildings worldwide?

Or make power supplies some sort of standard, so you can use your own expensive capacitor power supply instead? Or just shop for things that have separated power supplies?

DC-to-DC voltage conversion is easy, kind of: you simply convert it to AC and back.

So as it's going to become AC at some point anyway, why not use the existing wiring?

I've seen (owned a house with) dual-fuel natural gas + electric light fixtures, mounted to walls, gas pipes to every room right alongside the (knob & tube) electrical wires. So we've done weirder shit in the past to deliver two types of power throughout houses.

And no, I never tried lighting one. Hopefully none of them had had an active gas supply in decades.

For the devices that run off of low voltage DC today, lots of other commenters make good points about needing to regulate the voltage anyway, and I'm inclined to agree. Those devices generally use very little power anyway, so it probably isn't worth running a second set of wires because copper is expensive.

On the other hand, the biggest power users in a house (heater, air conditioner, dryer, stove, refrigerator) seem like they are a different story. In a future home with solar power and a high voltage DC battery pack, I think there is opportunity to make appliances that operate from the raw battery voltage, which is in the 300-500v DC range. That way, there isn't any conversion required at all, and those applications should be able to handle slight variations in voltage as the battery discharges.

Any high voltage DC distribution system would require every load to have a single run, with intelligent connectors paired with remote switches to shut off the circuit on detection of any alteration of the circuit (e.g. ground fault, capacitance change, etc.).

High voltage DC is spectacularly more dangerous than high voltage AC.

Could you give more details of this danger?

An arc on AC is more likely to be extinguished since the current drops to zero many times every second (so eg, on 50 Hz AC, an arc is not likely to last more than 1/100 of a second). DC switches and circuit breakers are designed with greater concern for arcs.
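The numbers behind that: current crosses zero twice per cycle, so the longest an arc can persist before it gets a chance to extinguish is half a cycle. A trivial calculation:

```python
# An AC arc can extinguish at each current zero crossing, which occurs
# twice per cycle -- hence "no more than 1/100 of a second" on 50 Hz
# mains. DC never crosses zero, so the arc must be broken mechanically.
for hz in (50, 60):
    half_cycle_s = 1.0 / (2 * hz)
    print(f"{hz} Hz: a zero crossing every {half_cycle_s * 1000:.2f} ms")
```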

I wish more devices would run on PoE. It's DC, it's easy to install, it carries power and data. There are some (but few) PoE light fixtures, but not as many as I'd like.

That does make sense, especially for IoT devices (literally anything low power). However, I haven't seen PoE used outside of business use. The problem is that Wifi is becoming ever more prevalent (on my network I have 12 devices using wifi, and only my desktop using an ethernet cable - ironically the only device that isn't suitable for PoE).

I use PoE UniFi gear, because running a single Ethernet cable to power e.g. a wifi AP gives me a lot more flexibility in placing it cleanly than having to worry about power; that's for residential use.

Given the degree of fire risk in houses that's associated with 230V/10A wiring and devices, I do wonder how much safer it would be to have fewer AC/DC transformers scattered around the house with DC wiring.

If you're using DC to transmit the same amount of power, you'll probably be using similar voltages (otherwise you're going to need thicker cables). I suspect DC is probably a greater fire risk in this case due to arcing. An arc is less likely to persist on AC since the potential keeps passing through 0 V (hence DC switches are designed to have a bigger air gap).

If it's just a matter of providing alternative circuitry for low-power devices, presumably you can already improve safety by just using lower-current circuit breakers.

> If you're using DC to transmit the same amount of power, you'll probably be using similar voltages (otherwise you're going to need thicker cables).

A good point - I guess I'm mostly thinking about the volume of things in the house that don't really need access to even a fraction of the 2.4kW available; having moved from incandescent bulbs to LEDs, I've got a bunch of unnecessary power supply and per-bulb transformers, for example.

This is a thing!

The ISS runs 120v DC for this reason: higher efficiency, and much less complexity since you don’t need inverters and power supplies.

My burning man camp of 65 people also only has a DC grid that runs on 24v

30kwh daily solar harvest

It’s a thing of beauty :)

Converting DC voltage is, unlike AC with transformers, not easy or practical.

Very few AC-powered devices even use traditional transformers anymore. Mostly they have some kind of switch-mode power supply which immediately rectifies the AC to DC. (Better and higher-power supplies have complicated active power factor correction circuits to do this.) Then they convert that high-voltage DC to low-voltage DC.

> I wonder if there's any future to running both an AC and a DC bus through the house

I used to have a house built around 1950 which had combination 117/220AC and 25VDC wired throughout the house. The electrical box was sectioned and the 25V had a large transformer. The house had low voltage DC ceiling fixtures that were tricky to source, and everything in the house that was switch controlled was on relays that allowed multi-point control, smart timers, and various other functions. It was a chore to get replacement parts, but fortunately the attic had gangways and such, so replacing blown relays was not horrific once you had the replacement parts. It was perfect for upgrading to smart control, which I did for about half the house before selling it.

Here's the thing. Tons of houses back then were wired this way. It was the wiring of the future, and they explicitly realized in the 50s that someday they would be able to have computers in the home, perhaps as small as a single room, control everything. So instead of an electrical closet there was an electrical room with enough space for a refrigerator-sized computer should such a thing ever come to market. 1950s.

Conk has a good response, but you also have to realize that it's still too expensive to set up a battery as well. So, since the grid will be needed, you'll need conversion to be able to handle the fluctuating demand your home requires. And don't install new devices, as that can change your equation; connecting back to the grid allows much-needed simplicity.

Other companies are working on this (energy sovereignty) issue; my guess is it (stand-alone affordable systems) will be solved within the next twenty years and be affordable. DC use might be a part of it, but I'm doubtful, since there is a level of complexity in that each device uses different amps/volts/etc. that will take new products to solve, and the independent grid system still takes a lot of capital.

New (>2005) AC/DC converters (switching PSUs) have such high efficiency that I would think it's not worth the hassle.

Pretty much all switching power supplies would run off DC as well - from 110V - 350V DC.

Is there any safe way to test this for a server PSU? As in "unlikely to blow the PSU", potentially requiring slight pre-checking by looking at the circuit?

Yes, a noop and declaring the "test" successful.

I would be very surprised if you could find any non-ancient (and I really mean ancient, like, pre-PC, probably even pre-home computers) computer PSU that you couldn't feed with DC. What voltages will work will vary, but as long as you stay below the peak voltage corresponding to the effective voltage printed on the label, you probably won't damage anything.

If you really want to test it, you could measure the resistance at the mains input, if that gives you a non-infinite value that stays at that value (rather than increasing towards infinity), you are dealing with a mains transformer. But you really won't find that in a computer PSU.

~240V fully rectified to DC is 340V. Since AC varies it's safe to say virtually any PSU that's not exclusively Japan or North America will take 350V DC.
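For reference, the rectified DC level is the AC peak: Vpeak = Vrms × √2 (minus a diode drop or two). A quick table:

```python
import math

# A full-bridge rectifier charges its filter capacitor to roughly the
# AC *peak* voltage, i.e. RMS * sqrt(2), ignoring diode drops.
for vrms in (120, 230, 240):
    vpeak = vrms * math.sqrt(2)
    print(f"{vrms} V RMS -> about {vpeak:.0f} V DC")
```

For 240 V RMS that comes out at about 339 V, matching the ~340 V figure quoted above.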


It's a hot-swappable 300-1897 (aka X6328A aka Type A217) 1050W Sun PSU. I'm primarily worried about the active PFC I expect in it.

Is there a way to test such a PSU without risking the entire server? (save for taking everything out that isn't needed to post into BIOS)

Well, you probably can somehow power it up without a server connected to it?

Active PFC tends to be simply a boost converter at the input stage that boosts the (rectified but still unfiltered) input voltage to (DC) 350 V or more, modulating the current to follow the mains voltage and should generally have no problem with DC input, especially so with wide-range inputs that have to be able to deal with different voltage levels anyway.

>modulating the current to follow the mains voltage and should generally have no problem with DC input

I'm particularly worried about confusing that following/regulating circuitry. If the 50/60Hz it senses is caused by screen refresh feeding back somehow, it might end up behaving "weird".

But I guess I'll just try, and hope it won't break anything.

If it weighs 5 kg+ (11 pounds), it's likely not a switching power supply, as it has a bulky transformer (though I can't imagine a server PSU not being a switching one). Other than that, laptops, phone chargers, etc. would work off DC. Most of them use a small transformer and a high frequency (above hearing range) to drop the voltage.

The transition is something like: 90-240V AC (50-60Hz) -> DC (1.414 × AC) -> high-frequency AC -> (transformer) low-voltage AC -> 12V DC.

Very cheap electronics that use capacitor+resistor voltage drop need AC.


To answer the question directly, use a small fuse 1-3A and run the PSU idle. Like advised by sibling, measuring the resistance would be sufficient as well.

Looked around a bit, and the link below says DC has a lot of problems. One is that you'd still have to convert your home DC voltage to whatever voltage each DC device uses.


Realistically, 25-30A with thick copper wire is doable. 20V is a good middle ground, and even then it could power a TV plus a couple of laptops, perhaps lights too. That would be a solution for a dedicated DC lane in-house, but the conversion to low-voltage DC must happen in-house (office buildings would require tons of copper). Galvanic corrosion might be an issue to a certain degree.

Carrying high voltage DC comes with its own issues, though it's likely doable. Many household devices would run perfectly fine on high voltage DC (instead of AC).

Honestly, just use aluminium cables. Those only need proper crimping, which isn't too expensive.

The cost for the cables themselves would be lower than copper AC, even.

A number of off grid cabin setups use DC distribution within the house and purely DC appliances (fridge, LED lighting, etc). Usually in the context of places that are very far from a grid, with wood heat, or tiny home size off grid setups.

Think of it as something like planning the DC loads on a 55 ft sailboat.

People who do this will usually have one small 300 to 500W sized DC to ac pure sinewave inverter for laptops and chargers of small consumer electronics.

It’s happening even more than you think.

Your fridge is probably running a “variable frequency drive”, so instead of being off and on at full blast, it just runs the compressor at 20Hz or 30Hz or whatever it needs continuously, dialing that up or down depending on demand.

Your air conditioner too.

For bigger customers, this saves them a fortune in demand charges.

For others, it prolongs life by vastly reducing stop/starts.

Most things that use DC use pretty low voltage, like 5 or 12 volts or so. High voltage allows you to transfer energy much more efficiently. So, for low-power devices it might make sense to run a low voltage line, but you wouldn't be able to run high-power devices off of it unless you wire your house with really thick wiring.

Also a layperson. How many different specs are running around the house, and what sorts of electrical variations can they withstand? I think you'd need to have a house-wide AC-DC converter to feed in energy when your solar supply drops down. Or a battery storage system probably makes more sense.

It's actually a good intuition. See: https://solar.lowtechmagazine.com/2016/04/slow-electricity-t...

tldr: it would make sense for an office building but actually not that much in a residential one.

Many of your AC devices are actually DC after a transformer.

You could do power over ethernet to get connectivity too.

More than 50% of the population will live inside megacities by 2050 [1]. While it is good to have solar cheaper for houses, the reality is most people won't have a rooftop solar option.

Edit: This is not to say 50% cheaper is useless; it is a point of information for people living in houses who may not realise that a large part of the world's population can't afford a house.

[1] https://www.cnbc.com/2018/05/17/two-thirds-of-global-populat...

I hope that telecommuting will prevent this. The same amount of money can net you a much higher standard of living in a rural area with less crime and pollution and an all around better place to raise children.

Having half the world converge on urban centers scrambling to pay exorbitant rent and taxes for the privilege of 100sq feet of personal space while sharing an apartment with five strangers is misguided at best and dystopian at worst.

This thinking is misguided. You can house people way more economically in high-density high-rises: you take the lift to the doctor, the store, and the school. Internet, water, waste, and food transport costs are minimised. You do not need kilometres of asphalt to connect people, and power and water distribution is way more cost-effective.

I see these small families in the new developments I sometimes drive through; there is maybe a meter of sand or grey grass between each house, each house has its own roof, water heater, windows looking into the fence 1m away. All of this has to be maintained. These people would have been better off living in an apartment in a high-rise.

> living in a rural area with less crime

I doubt “rural” will continue to be “low crime”.

In our fully automated future I expect criminals will also be telecommuting — stealing, vandalising, or trading illegal substances with drones, drones which are quite possibly either bio-mimicking or even bio-printed so as to make it less appealing to shoot them.

That is a very far fetched idea.

Why? Drones are already used for drug delivery, bioprinting is a thing albeit not an amazingly good thing yet, and biomimicry is demonstrated in occasional TED talks and military spy drone demonstration videos on YouTube.

There are so many weird angles to it that I just don't know where to start. In a sentence, it's just a big leap to go from a couple of disjointed currently-niche technologies to the conclusion that life in rural areas is unlikely to be as safe in the future.

I didn’t say “safe”, I said “low crime”. Overall safety might increase, but I expect crime rates to go up everywhere as tech enables it. At least in the next 10-20 years, beyond that is unpredictable change to too many aspects of our world.

There will be countermeasures as well. Whether a bird is fake or real, breaking into a house can still be detected.

I imagine we'll also see small robots that scurry into your garage when you open the door and whose one job will be to open the door later for thieves. That is probably already possible. So we'll need intelligent security cameras in garages too.

Drones that crash through a window and set the place on fire - I don't know what we'll do about that one. But anticipating and solving such problems would make an interesting startup.

Drones that crash through a window and set the place on fire - I don't know what we'll do about that one.

There are various window films that can do an excellent job of making it quite hard to break a window. But they're currently quite expensive. E.g.: https://www.3m.com/3M/en_US/home-window-solutions-us/ (sorry for previous bad link, 3M website bozos make it hard to get a sane URL)

I live in a "stick house" and it would be quite easy to set it on fire from the outside. Cedar siding, shrubs near the walls, etc. No need to crash through a window.

Edit: I forgot to mention a roof made of "kindling". It wore out and I replaced it with a composition roof, but in Oregon it's trendy to have a "shake" roof. If you've ever examined a shake you will know it's just a very dry piece of wood (at least in summer, in winter it's very wet).

Fortunately we don't live in such a dystopian world ... yet, at least not in most parts of the USA.

> If you've ever examined a shake you will know it's just a very dry piece of wood (at least in summer, in winter it's very wet).

Then there’s the varnishes and oils that you apply to repel water, which just makes it more flammable.

I hope not. The climate externalities are huge. While city dwellers minimize water use, carbon use, and electricity use, rural dwellers maximize these. Perhaps we can ensure that people pay for their carbon.

Do you have any sources? I've found one article, but it was arguing that the difference isn't actually that large. Would be open to more recent and other studies though.


There's a map here of per-capita carbon footprint per zip code: https://coolclimate.org/maps

It seems to suggest that suburbs are generally the biggest emitters per-capita. Inner cities tend to be fairly low and some rural areas are low too, though that varies (rural areas in the midwest seem to be high).

If more people moved to rural areas, my guess is they'd start looking more like the suburban areas in energy usage, though.

Cool, thanks. One interesting thing I noticed as a midwest town dweller is that while my zip code rates quite low in population density (up there with Houston), it's actually in the green as far as carbon goes. It seems that this is true of many of the smallish towns in the midwest (population < 500k). In general, suburbs of large cities use vastly more carbon than towns with the same population density, which isn't really that surprising.

I guess the question then is whether it's easier to have more small towns or to have high-density cities without surrounding sprawl.

It'd be interesting to compare US settlements with many of their non-US counterparts.

Hm. Okay, it's quite surprising that Manhattan is green. Maybe billionaires' row doesn't have its own zip code.

Also it's a US only map. It would be interesting to compare with cities like Amsterdam, London, Shanghai, Hong Kong, etc. And of course the corresponding rural areas and small cities too.

Having the lowest car-ownership rate in the US probably helps Manhattan's overall numbers. :-) Per-capita square footage (which directly impacts heating/cooling energy usage) is also fairly low by suburban standards, even in the rich parts of Manhattan. There are some big apartments on the Upper East Side, but most of them are still small compared to what you find in Westchester.

I would definitely be interested in a more international version, too.

Eh. I strongly suspect that cheap, quick, self-driving (solar-sourced) electric cars and VR + seamless telecommuting is going to nip urbanization in the bud and send people back to the burbs.

Maybe 20-30 YO college graduates will do a tour of duty in SF/NY for fun, but nobody in their 30s+ is going to want to go to bed listening to their upstairs neighbors party, their downstairs neighbors screw, and the homeless schizophrenic outside systematically shatter glass bottles on the sidewalk.

You can live in the city without experiencing any of those things though.

Correct, if you are rich enough to own a penthouse suite, you will not have upstairs neighbors.

Or if you're rich enough to live in a house with modern sound insulation, which should be affordable to most middle-class families. I almost never hear my neighbors, and the house I live in was constructed in the early fifties. Newer buildings have much better sound insulation.

I'm not sure this means much. LA is considered a mega city. All that means is lots of people; it says nothing about density.

Passive solar for air and water temperatures is the way to design it into new homes. Cool in summer, warm in winter. Few to no moving parts or electricity needed. Little to ever replace. Anathema, though, to the fancy solutions favored by VCs, like photovoltaics, which rely on strip-mining millions of hectares of arable land (with the attendant toxic run-off) to produce panels and batteries.

For retrofits of existing homes, geothermal whenever possible for upgrading central climate control where present. Current professional installers, though, are generally incompetent and price-gouge. Sufficiently motivated hackers can easily DIY.

Passive design can be climate-specific, but I agree, it needs to be built into new designs. The problem I'm seeing is that "no one" cares: developers will build houses with gorgeous kitchens/bedrooms/living rooms, but the HVAC system will be bottom-shelf equipment with flex duct run all over the place (and pinched in a few of those places). Because no one sees it, no one thinks to question it. We really need to figure out how to make it desirable and something people demand. Maybe that means raising electrical rates or taxing antiquated system design? Some way of hitting the wallet?

I like what you're thinking about geothermal; it sparked a few thoughts, though: Geothermal is expensive because of labor, not necessarily price gouging. Most installers (in my area) put a 10% mark up on everything: finding competent installers is harder and harder these days, so you're paying more and more. HVAC techs can make six figures if they have the experience and aren't afraid to negotiate. Unless, of course, they ARE price gouging, which happens, especially if they're already busy (which has been the case for the past three years).

The technology is solid, though to be frank, a high efficiency air-to-air heat pump will be comparable to geothermal, depending on the climate zone. If we can figure out a cost-effective way to replace gas-fired heat in northern climates with something more eco friendly (and consumer friendly), progress would be made.

Charging higher prices when demand is high is not gouging.

If people are making 200k at FAANG, what's wrong with tradespeople making 100k when their skills are also in demand?

Everybody who supplies or resells equipment puts a mark up on it because there is some cost for them to select it, procure it, store it, transport it, finance it, etc.

In construction, price gouging is often less about the rate than about less-ethical charges.

"Geothermal is expensive because of labor, not necessarily price gouging."

In my experience, this is true. Comparing the electric heat system from one HVAC contractor to the ground source system I went with, there are a few expenses that are unique to ground source that start to add up. (USA - 2016)

$3,000 for the man and machine to dig the trenches and fill them back in.

Unknown - 2 days with a handful of laborers and 2,400 feet of pipe to lay and fuse together.

1/2 day to fill and purge the air from the pipes.

The equipment costs more, for sure; I don't know an exact figure, but it's multiple thousands.

Overall, the 2 bids were about $15,000 different. Tax credits and even a rebate from the utility co-op made it close enough to a wash that I went with ground source and have been very happy.

When people ask about my monthly bill, the next question is "Why doesn't everyone do that?" I think many people don't care enough to deal with the sticker shock of a more expensive system. Another big reason is the amount of land required for horizontal trenching (there are other options, but this is generally the least expensive).
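For anyone curious how that $15,000 gap "washes out," here's a rough sketch. The $15k premium and the 30% federal credit rate come from the thread; the utility rebate amount is a placeholder, since the comment doesn't give a figure:

```python
# Rough net-premium estimate for the ground-source bid over the
# cheaper bid. The $15,000 gap is from the comment; the 30% federal
# tax credit is real, but the rebate amount is a placeholder.
gshp_premium = 15_000     # extra cost of the ground-source system
federal_credit = 0.30     # US residential geothermal tax credit
utility_rebate = 2_000    # placeholder -- actual co-op rebate not stated

# Lower bound: apply the credit only to the premium. In practice the
# credit covers the whole system cost, so the real gap is smaller.
net_premium = gshp_premium * (1 - federal_credit) - utility_rebate
print(f"net premium: ${net_premium:,.0f}")  # net premium: $8,500
```

With the credit applied to the full system price rather than just the premium, it's easy to see how the two bids end up close to a wash, as the commenter says.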

I also agree that if your climate is mild enough to use an air source pump, go for it. They don't quite cut it where I'm at (USA, just shy of Canada).

> The problem that I'm seeing is that "no one" cares... Because no one sees it, nor thinks to question it.

Put another way: the cost of operations isn't factored into the sales price. The builders don't have exposure to how much the buyer will pay to heat the home, so there's no reason to do extra work to insulate or seal better.

There are starting to be home efficiency standards that are being considered in the real estate markets (things like LEED certification for homes), but you're right, people are more likely to buy based on the look of the kitchen than the winter heating bill.

I've been trying to think if there's a financialization play possible... something like monetizing the downstream efficiency savings to incentivize the upfront work. In the solar PPA model, the solar company's monthly bill becomes your new electricity bill, and (ideally) paying off the solar installation monthly over 20 years is still cheaper than paying for non-solar electricity during that time. I've been wondering if something similar can be done for home efficiency upgrades.

> The problem that I'm seeing is that "no one" cares

It's even more fundamental: no one knows about passive buildings. With homes, it's like we are still in the 50s with gas-guzzling cars. It took a major oil shock to force a shift in fuel economy. No such impetus yet for homes, but when it comes, hopefully we'll have figured out a way to retrofit our housing stock cost-effectively.

At least in the EU the "no one cares" is changing. Member states need to have legislation in place that means from 2020 all new builds are "nearly carbon zero" (unfortunately left to interpretation by country). In my country that means homes have insulation and heating requirements on par with Passive House standards, as well as the majority of their energy coming from renewable sources.

Solar hot water heating works really well in the northern US. It’s an odd tradeoff as you get less energy so you need to scale up, but you also use the system more days out of the year. You do want a large thermal mass which can be difficult to add to an existing home.

The biggest thing you can do is properly insulate.

100-200mm of insulation on the outside of the house/loft/underfloor, plus triple glazing.

The biggest "passive" design from my understanding is a few deciduous trees. In the summer, the leaves block out the sun, but in the winter, the leaves fall off and the sun directly heats the home.

Bigger factors in passive house design are siting and orientation, building envelope air sealing, continuous insulation, and correct window placement and shading - usually with purpose designed shade structures - not trees.

It's not that trees can't provide the shade, but they take a while to grow enough to achieve the immediate effect of an awning. If you have a mature tree on a site, however, you could design a house that takes advantage of the shade it already provides.

That's fine when you live in a house. Not so relevant when you live 20 storeys up. I feel like way too much time and effort goes into improving single family houses, as opposed to larger multi-family buildings. They could be built to be much more efficient and just as comfortable. But alas.

One problem is that fire prevention and safety has trumped energy efficiency.

Can I open my window more than 4"? No, someone could fall out.

Can we prop our doors open to get a draft? No.

The reduced external surface area to volume ratio really improves efficiency though.

By geothermal you mean a ground-source heat pump? What is the bare minimum cost of the materials & digging & installation?

It's always a lot. It can be kinda low if you're allowed to do an open-loop system and have a high water table, but those are often banned.

Passive homes are great, but they aren't mutually exclusive with PV. In fact, passive homes typically use PV for powering systems that require electricity to run.

Why is it so expensive to install solar in the US? In the UK you can get a 3kW grid-tied system installed for under £5,000 ($6,100), and around double that for a 9kW system. A lot of the cost is renting scaffolding and having people on site to do the work, so it gets cheaper per kW as you have more installed.
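For comparison, here's the per-watt arithmetic on those UK figures. The numbers come from the comment above, with "around double" for 9kW taken as £10,000:

```python
# UK installed cost per watt at two system sizes, from the quoted
# prices. "Around double" for the 9 kW system is assumed to be £10,000.
quotes = {3_000: 5_000, 9_000: 10_000}  # watts -> price in GBP

for watts, price in quotes.items():
    print(f"{watts / 1000:.0f} kW: £{price / watts:.2f}/W")
# 3 kW: £1.67/W
# 9 kW: £1.11/W
```

So the scaffolding/labour overhead shows up clearly: tripling the system size cuts the per-watt price by about a third.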

This is my thought too: looking at the graph, a good dollar per watt appears to come from permits and such.

I'm not aware of having to notify the local council or power company about installing solar. (You do have to get certification before you can receive money for generation, but I don't think that costs significant money; it's more a case of proving the provenance of the solar panels and equipment.)

This is the key to this competition-winning business model: more than half the cost of a resi solar system in the US is in the installation and the design/permitting.

Permitting involves things like making sure the design is up to fire codes (eg there are setback requirements to make sure a firefighter can put a ladder on the roof) and the interconnection work with the utility that lets you send energy back into the grid. In the worst states, these extra costs are largely artificial barriers (ie outright anti-competitive tactics).

I believe Germany, for example, put a lot of work into streamlining all this permitting and interconnection, and as a result has much lower solar costs.

What this company appeared to have proposed in this contest is that by bolting solar onto pre-fab homes in the factory:

1. Installation costs drop, presumably because it's built into the design of the home and there's factory tooling around it instead of a retrofit system; and

2. Permitting costs go down because the design is standardized and presumably you can get economies-of-scale if you're connecting a whole subdivision or trailer park all at once.

This also nicely addresses the equity angle since prefab homes are generally aimed at lower income levels -- the claim here is that this building can drop the price of solar by almost half for this demographic.

The biggest cost saving in the UK (at least) would be the cost of scaffolding. It's about £900 to get it up, which is 1/5th of the total install in some cases.

I have a quote to add a 13kW system to my house with a flat roof for about $40k.

Then I’d get 30% back from the government - and my state has no sales tax on this (which would have added 10%)

So, net cost to me of about $28k for a 13kw system

This does seem a bit pricey, but the tax rebate goes down a bit next year, so we will likely do it this year
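Checking the arithmetic on this quote, using only the numbers the comment gives:

```python
# Net cost and $/W for the quoted 13 kW flat-roof system.
gross_cost = 40_000     # quoted installed price
system_watts = 13_000   # 13 kW
federal_credit = 0.30   # 30% back from the government

net_cost = gross_cost * (1 - federal_credit)
print(f"net cost:  ${net_cost:,.0f}")                  # $28,000
print(f"gross $/W: {gross_cost / system_watts:.2f}")   # 3.08
print(f"net $/W:   {net_cost / system_watts:.2f}")     # 2.15
```

At just over $3/W gross, this lines up with the sibling observation that quotes below $3/W exist elsewhere, so "a bit pricey" seems fair.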

In Australia, I think a 13kW install would be quoted at about US$10k or less. We put 5-6kW on at home, and 6-7kW at my small office building. Usually comes out to around $1/W.

Unless you live in a high cost area, sounds high to me. Quotes I have seen are below $3/watt.

* batteries not included - that’s an extra add on for about 50% more!

That should move the average cost in the US's favour - they use so much more electricity in a typical home than we do. US solar installs are bigger than our average install of 3-4kW - probably double, if I had to guess.

Don't all UK installs over 6kW get counted as commercial still?

I got a quote recently (Bay Area, CA) that was about $15k for 7.7kw. So not too far off from your 9kw quote.

In Brazil I got a 5kW grid-tied system for about $5,000. No batteries, no subsidies, and I have 5 years to use everything I inject into the grid. But I still have to pay for a minimum of 100kWh each month just for the availability of the grid.

American businesses optimize for wealth extraction. Prices are inflated to pay for extra homes, yachts, privatized inflated healthcare prices, etc.

Basically the emotional meme here is to wrap oneself in a reality distortion field fueled by avarice.

The way Americans rationalize this is by saying that if a business is not insanely profitable, it's not even worth starting. In tech this manifests as the idea that you should only create "world-changing" businesses, and that if you have a so-called "lifestyle" business you're a loser. The reality is that these insanely profitable businesses can only exist when you have a deep-pocketed investor as a co-owner; that is, a billionaire owning a large part of the business is a necessary part of the process.

I do think this is closer to the truth even though you're being downvoted. I worked in the industry for about a year and I'm working on starting up my own company now.

If you removed the 30% tax credit, it would turn $3 a watt into the UK's ~$2 a watt. I know it's not that simple, but it's not far from the truth. Even Tesla is trying to launch a product at $1/watt for commercial/industrial clients.

For all the talk of AC vs. DC, that debate is moot. Current on-grid solar tech integrates a grid-spec microinverter into each panel. In a net-metering install (no batteries, generators, etc.) this is far, far simpler: no loads to calibrate, no DC copper to run, no central inverter to break; just plug the panel into the mains and watch your bill drop. That is the way forward for urban/suburban solar power.

Not entirely. It will also cause brownouts at scale, as there is no secure, scalable way to stop production on peak solar days (save for shutting off the supply, which kills the inverters).

solar + battery is a far better solution as it can soak up peak generation _and_ peak demand.

Once solar installations reach a scale where that becomes a problem, I'm sure you'll find enterprising people who find ways to suck up extra power when there is a surplus.

>That is the way forward for urban/suburban solar power.

No way. The future of residential solar is coupled with storage. And the future of storage is in DC coupling, not wastefully inverting solar to AC just to invert back to put it in the battery.
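The efficiency argument here is just multiplied conversion stages. The percentages below are illustrative assumptions, not measured figures for any particular product:

```python
# Solar -> battery round trip: AC coupling vs. direct DC coupling.
# All efficiency numbers are illustrative assumptions.
pv_inverter = 0.96      # DC -> AC conversion (assumed)
battery_charger = 0.96  # AC -> DC back into the battery (assumed)
dc_dc_converter = 0.98  # single direct DC-coupled stage (assumed)

ac_coupled = pv_inverter * battery_charger  # two lossy stages
print(f"AC-coupled:  {ac_coupled:.1%}")       # 92.2%
print(f"DC-coupled:  {dc_dc_converter:.1%}")  # 98.0%
```

Every stored kWh pays the penalty twice more on the way back out to AC loads, which is why the stage count matters.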

Tesla Powerwalls support either AC or DC coupling, depending on system architecture (grid tie, inverter type, etc). It's unlikely homes will be designed for DC powered loads considering the copper costs (versus conversion losses between current types).

TIL - never seen a DC-coupled PW2 in the field. Looks like it was killed off in 2017 (i.e. before the PW2 was widely available).



Here's a good video on the problems of transmitting power, and why we need such high voltages.

How do Electric Transmission Lines Work? https://www.youtube.com/watch?v=qjY31x0m3d8
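The gist of the video can be shown with a quick I²R loss calculation; the load and line resistance below are made-up illustrative numbers:

```python
# Resistive loss for the same delivered power at two line voltages.
# P_loss = I^2 * R, and I = P / V, so raising V cuts loss quadratically.
power = 10e6      # 10 MW delivered (illustrative)
resistance = 1.0  # 1 ohm of line resistance (illustrative)

for volts in (10_000, 100_000):
    current = power / volts
    loss_kw = current ** 2 * resistance / 1e3
    print(f"{volts:>7} V -> {loss_kw:,.0f} kW lost")
```

A 10x increase in voltage gives a 100x reduction in line loss for the same delivered power, which is the whole case for high-voltage transmission.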

So why are the pictures on the site clearly of retrofits of solar panels to an existing tile roof?

It’s a start up and just started working with manufactured housing builders.

Corporate press releases get voted up to #2 on HN these days? Unfortunate.

It’s a Sunday. Weekend HN is a free-for-all. I kind of like it this way.

A glimmer of hope for clean energy, before it is too late... "I want to believe"

Yeah. Only on the weekend does "buy a new house to get solar" make sense. And no one seems to be fazed that the DOE is the one pushing it.

Combining the roof with the solar panels seems like a win. A very large percentage of the cost of solar installation (50% last I checked) was labor. So when it's time to redo the roof on a house, you can also install solar panels. The interesting question is whether the panel lifecycle is the same as the roof lifecycle.

Regardless of the long term trends toward populations moving toward the city, I don't see why most newly built suburban homes after 2025 wouldn't come with a Tesla (or competitor) solar roof.

The $20-50k that it would cost would just get absorbed into the mortgage and become an asset that increases the value of the home. Besides that, over a 30-year period, if you are able to capture even half that value in electricity savings, it becomes a no-brainer, since it would cost what a normal roof costs anyway.

There are also the benefits of more efficient energy transmission and resiliency of micro-grids as just bonuses.
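As a rough sketch of the "absorbed into the mortgage" math - the $30k premium and 4% rate are assumptions for illustration, not figures from the thread:

```python
# Monthly payment a solar roof premium adds to a 30-year mortgage,
# using the standard amortization formula. All inputs are assumptions.
principal = 30_000  # assumed solar-roof premium (mid-range of $20-50k)
annual_rate = 0.04  # assumed fixed mortgage rate
n = 30 * 12         # number of monthly payments

r = annual_rate / 12
payment = principal * r / (1 - (1 + r) ** -n)
print(f"added monthly payment: ${payment:.2f}")  # roughly $143/month
```

So the question becomes whether the roof reliably saves at least that much per month in electricity, which depends heavily on local rates and sun.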

It's very annoying when the cost of something advanced comes from very trivial legacy issues like that.

I wish someone would develop cheap adapters while construction codes get updated to accommodate panel frames.

It's always cheaper to design a feature into a product than to add it later.

Wish I saw more CNC style designed houses :(
