I wonder if there's any future to running both an AC and a DC bus through the house, or if that makes no sense for reasons beyond my basic understanding.
He still has all the original 110 wiring and fixtures in the house, but this battery-backed solar system runs all his lights. It can easily go 2 days without sun before he needs to use the regular lights. Another benefit is that DC LED bulbs are by definition flicker-free, and also low consumption (4-6 watts each) but bright. He calculated he can light each room better than before with 6-8 bulbs, so 40-50 watts, thus only 4-5 Amps (@ 12 VDC) to each room.
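The per-room arithmetic above can be checked with a quick sketch (the 6-8 bulbs at 4-6 W each are the figures from the comment; the rest is just P = V·I):

```python
# Rough per-room load for the 12 VDC lighting bus described above.
bulbs_per_room = 8        # upper end of the 6-8 bulbs mentioned
watts_per_bulb = 6        # upper end of the 4-6 W range
bus_voltage = 12.0        # VDC

room_watts = bulbs_per_room * watts_per_bulb   # 48 W
room_amps = room_watts / bus_voltage           # I = P / V

print(f"{room_watts} W per room -> {room_amps:.1f} A at {bus_voltage:.0f} V")
```

So the worst case is about 4 A per room, comfortably inside the 4-5 A figure quoted.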
All the big appliances in the house still run off of AC, but he does have a 110 inverter so that he could run a critical appliance such as a fridge during an extended outage.
So this little system replaces 1-2 kWh of incandescent lighting per day (CFLs or LEDs on 110 would be less, but not flicker-free), while providing lights during outages. He says it's not cost-effective (the usage displaced by solar and battery doesn't cover the cost of the system), but the backup power is worth it to him.
I guess you could pick whatever voltage you want for your loads and cable runs, but you need to have the appliances you want available (or easily DIY) in that voltage. It would probably be possible to convert down the voltage once, but any more than that and you're introducing loss again.
In other words, a 120VAC LED should work just fine on 48VDC with the same output.
It also explains why they don’t dim.
(Obviously avoid the dimmable ones in this application!)
It is putting more current through each light.
And particularly problematic if each bulb has parallel LEDs and they start burning out: then more and more current is put through the remaining LEDs.
You still need PWM to control brightness, no? If you're using resistors, efficiency goes down a lot.
12 volt wiring in homes isn't terribly efficient, as once you get past 6-10 feet you need really expensive (heavy, thick) wires to avoid too much voltage drop. Most new boats use 24 or 48v for the long runs and then step down locally to 12v where needed (kitchen, living, sleeping areas, etc).
If you transmit the same number of watts at a higher voltage (lower current) rather than at a lower voltage (higher current) through a wire, you will lose less power.
So for moving the power around the house at distances more than around 10 feet (~3 meters), it's more energy-efficient to convert it near the point of use. And if you're already going to convert it, it's probably not worth the expense and complication of running two separate wiring systems.
I think the difference you're looking for is in volume of copper.
The big savings would be in longer-lived electronic devices. A sizable portion of digital electronics ultimately fail because the capacitors in their power supplies exceed their designed lifetime. Electrolytic capacitors (ticking time bombs) are cheap and compact for their capacity, so they are used for the energy storage required in single-phase systems; but a three-phase power supply requires less storage, and maybe manufacturers wouldn't have to use electrolytics for the smaller capacitors. I've had a 4K TV and a rather expensive musical instrument amp fail because of capacitors this year, about $3000 of gear there. Looking back further, there are a couple of professional powered sound reinforcement speakers (twice) and an iMac G5 (three times). It's probably safe to say that over the decades I've had $10k of electronics fail prematurely from electrolytic capacitor failure. These were all items which were not obsolete or worn out. Scale that up from one person to worldwide and it's easily in the hundreds of billions (10⁹) of dollars which could be saved.
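The "less storage" claim can be made concrete with the usual hold-up formula, C = 2·P·t / (V_peak² − V_min²): a single-phase rectifier must ride through the ~10 ms gap between mains peaks at 50 Hz, while a three-phase six-pulse rectifier sees a new peak every ~3.3 ms and its output never dips below ~87% of peak anyway. The load power and allowed sag below are illustrative assumptions:

```python
# Bulk capacitance needed to ride through the gap between rectified
# mains peaks: C = 2*P*t / (V_peak^2 - V_min^2).
import math

def holdup_cap_farads(power_w, gap_s, v_peak, v_min):
    return 2 * power_w * gap_s / (v_peak ** 2 - v_min ** 2)

P = 100.0                     # supply load in watts (assumed)
v_peak = 230 * math.sqrt(2)   # ~325 V peak on 230 V mains
v_min = 280.0                 # allowed sag before regulation drops out (assumed)

# Single phase, 50 Hz: a peak every half cycle -> 10 ms gap.
c1 = holdup_cap_farads(P, 0.010, v_peak, v_min)
# Three phase, six-pulse rectified: a peak every 1/300 s -> ~3.3 ms gap.
# (In reality the six-pulse output only dips to cos(30°) ≈ 0.87 of peak,
# so this is an upper bound; the true requirement is smaller still.)
c3 = holdup_cap_farads(P, 1 / 300, v_peak, v_min)

print(f"single phase: {c1 * 1e6:.0f} uF, three phase: {c3 * 1e6:.0f} uF")
```

Even this pessimistic version shows a 3x reduction, which is the headroom that might let a designer swap electrolytics for longer-lived film or ceramic parts.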
The problem is you can't get there from where we are. No one wants to drag more wires to residential houses, no one wants to shell out >$1k for a household phase converter and then rewire their house. You won't convince people to add three phase to a new house because it might be the future. And you won't convince the semiconductor houses to make a line of three phase power supply controller chips just because, well, maybe someone might use it.
We're probably trapped at a suboptimal local maximum.
Edit: In response to atoav's comment I found IEC 60309 and its extra low voltage variant, which is a 42mm (1 2/3") 3 phase plug and connector in two compatible sizes, good for 1.3kW and 2.6kW. Also, the Germans and Swedes did the engineering first and then built an electrical system, and they get 400V 3 phase at 16A to their cooktops and stoves through a plug, for a whopping ~11kW!
Edit: and as an aside, why we probably don't do that everywhere: it makes for bigger plugs, and keeping safety standards can be a bit more expensive and space intensive with the potential 400V between phases.
I see the successor to the 16A Perilex connectors you are referencing is IEC 60309 which has an "extra low voltage" version with a smaller, 42mm diameter (1 2/3") plug good for 16A or 32A. 16A devices will fit in a 32A socket, but not the other way around. So that seems to be a 1.3kW and 2.6kW solution. Sign me up!
Germany is 230v so that would be more like 3kw and 6kw.
I'm in the UK (also 230v) and a standard outlet can do 3kw. I think fuseboard fuses are 32 amp, so I don't know if 6kw is a practical upper bound though (edit: on second thoughts I'm not even sure this is relevant to 3 phase).
Modern single ovens can also be connected to a standard single phase outlet rather than a dedicated circuit. In my kitchen the induction cooktop is on a 16A three phase circuit, and all the other outlets (fridge/freezer, oven, dishwasher and appliances) are on a 16A single phase circuit.
5:11 (min:sec) from 20°C to 220°C
I've never actually come across anything higher than 3kw. Also, a 16 amp fuse isn't that precise; you could probably get up to 20 amps without it failing. The hotter you run it, the quicker it fails, though, so if you're regularly running 3.7kw through a 16 amp fuse you may well find it failing, and not through any underlying fault of anything plugged into it.
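The numbers being traded here are all just P = V·I; a quick check at 230 V shows why 3.7 kW sits right at the edge of a 16 A circuit:

```python
# P = V * I: what a 16 A circuit carries at 230 V, and the current a
# "3 kW" or "3.7 kW" appliance actually draws.
volts = 230.0

kw_at_16a = 16 * volts / 1000   # the continuous rating of the circuit
print(f"16 A at {volts:.0f} V = {kw_at_16a:.2f} kW")

for watts in (3000, 3700):
    print(f"{watts / 1000:.1f} kW at {volts:.0f} V = {watts / volts:.1f} A")
```

A 3 kW load draws ~13 A (hence the UK's 13 A plug fuse), while 3.7 kW draws ~16.1 A, fractionally over the fuse rating, which matches the "regularly running 3.7kw through a 16 amp fuse" failure mode described above.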
Most cables are 16A, so the outlets are 16A as well. Also UK uses 240V, not 230V. 32A circuit breakers would make the cables burn when overloaded.
"Use a 30amp fuse or 32amp MCB for larger radial circuits, ring circuits up to 100m, cookers and electric showers up to 7200W"
>ring circuits up to 100m, cookers and electric showers up to 7200W
Normally those tend to be 3 phase.
Of course it's possible to get over 16A on a single phase and a circuit breaker, but that's quite a non-standard option. The issue is that if the fuse/circuit breaker is rated for 32A in a standard application (i.e. an outlet), the cables are likely to burn first.
I'm talking about the UK here, which I understood to be similar to the rest of Europe(?).
That's a UK website and 3 phase isn't common here. I've just double checked my fuse box, and I have a 32 amp fuse for my socket ring main.
Don't forget that for a ring main the current is spread over 2 wires, so if there's a 32 amp power draw, there's only 16 amps on each side of the ring.
We don't have three phase (well, it's not common), but we can heat an oven up to 220°C (428°F) in 7 minutes. I think it's a 3kw oven; it's on a 32 amp (isolated) breaker.
All we are missing is for it to become standard to wire the 3 phase distribution throughout buildings to every outlet (preferably in a backwards compatible manner; in theory, converting 3-phase to single phase is just a matter of ignoring a wire). I would assume that a) the grid can handle people doing this, and b) doing this with our current 3-phase standard would result in our current single phase standard.
While the appliances are probably approved for 208V, I doubt they’re drawing more peak current to make up for it.
Or make power supplies some sort of standard, so you can use your own expensive-capacitor power supply instead? Or just shop for things that have separate power supplies?
So since it's going to become AC at some point anyway, why not use the existing wiring?
And no, I never tried lighting one. Hopefully none of them had had an active gas supply in decades.
On the other hand, the biggest power users in a house (heater, air conditioner, dryer, stove, refrigerator) seem like they are a different story. In a future home with solar power and a high voltage DC battery pack, I think there is opportunity to make appliances that operate from the raw battery voltage, which is in the 300-500v DC range. That way, there isn't any conversion required at all, and those applications should be able to handle slight variations in voltage as the battery discharges.
High voltage DC is spectacularly more dangerous than high voltage AC.
Given the degree of fire risk in houses that's associated with 230V/10A wiring and devices, I do wonder how much safer it would be to have fewer AC/DC transformers scattered around the house with DC wiring.
If it's just a matter of providing alternative circuitry for low-power devices, presumably you can already improve safety by just using lower-current circuit breakers.
A good point - I guess I'm mostly thinking about the volume of things in the house that don't really need access to even a fraction of the 2.4kW available; having moved from incandescent bulbs to LEDs, I've got a bunch of unnecessary power supply and per-bulb transformers, for example.
The ISS runs 120v DC for this reason... higher efficiency and much less complexity, since you don’t need inverters and power supplies.
My burning man camp of 65 people also only has a DC grid that runs on 24v
30kwh daily solar harvest
It’s a thing of beauty :)
I used to have a house built around 1950 which had combination 117/220AC and 25VDC wired throughout the house. The electrical box was sectioned and the 25V had a large transformer. The house had low voltage DC ceiling fixtures that were tricky to source, and everything in the house that was switch controlled was on relays that allowed multi-point control, smart timers, and various other functions. It was a chore to get replacement parts, but fortunately the attic had gangways and such, so replacing blown relays was not horrific once you had the replacement parts. It was perfect for upgrading to smart control, which I did for about half the house before selling it.

Here's the thing. Tons of houses back then were wired this way. It was the wiring of the future, and they explicitly realized in the 50s that someday they would be able to have computers in the home, perhaps as small as a single room, control everything. So instead of an electrical closet there was an electrical room with enough space for a refrigerator sized computer should such a thing ever come to market. 1950s.
Other companies are working on this (energy sovereignty) issue; my guess is it (stand-alone affordable systems) will be solved within the next twenty years and be affordable. DC use might be a part of it, but I'm doubtful, since there is a level of complexity (each device uses different amps/volts/etc.) that will take new products to solve, and the independent grid system still takes a lot of capital.
I would be very surprised if you could find any non-ancient (and I really mean ancient, like, pre-PC, probably even pre-home computers) computer PSU that you couldn't feed with DC. What voltages will work will vary, but as long as you stay below the peak voltage corresponding to the effective voltage printed on the label, you probably won't damage anything.
If you really want to test it, you could measure the resistance at the mains input, if that gives you a non-infinite value that stays at that value (rather than increasing towards infinity), you are dealing with a mains transformer. But you really won't find that in a computer PSU.
It's a hot-swappable 300-1897 (aka X6328A aka Type A217) 1050W Sun PSU.
I'm primarily worried about the active PFC I expect in it.
Is there a way to test such a PSU without risking the entire server? (save for taking everything out that isn't needed to post into BIOS)
Active PFC tends to be simply a boost converter at the input stage that boosts the (rectified but still unfiltered) input voltage to (DC) 350 V or more, modulating the current to follow the mains voltage. It should generally have no problem with DC input, especially with wide-range inputs that have to be able to deal with different voltage levels anyway.
I'm particularly worried about confusing that following/regulating circuitry. If the 50/60Hz it senses is caused by screen refresh feeding back somehow, it might end up behaving "weird".
But I guess I'll just try, and hope it won't break anything.
The transition is something like: 90-240V AC (50-60Hz) -> DC (1.414 * AC) -> high frequency AC -> (transformer) low voltage AC -> 12V DC
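The "1.414 * AC" in that first step is just the peak of a sine wave, √2 times its RMS value, which is roughly where the rectified DC bus ends up sitting. A quick sketch across the usual wide-range input span:

```python
# The rectified DC bus sits near the mains peak: V_peak = sqrt(2) * V_rms.
import math

for v_rms in (90, 120, 230, 240):
    v_dc = math.sqrt(2) * v_rms
    print(f"{v_rms:3d} V RMS -> ~{v_dc:.0f} V peak / DC bus")
```

This is also why feeding such a supply with DC works: a DC input a bit below the peak voltage lands inside the range the downstream converter already has to tolerate.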
Very cheap electronics that use capacitor+resistor voltage drop need AC.
To answer the question directly, use a small fuse 1-3A and run the PSU idle. Like advised by sibling, measuring the resistance would be sufficient as well.
Carrying high voltage DC comes with its own issues - likely doable though. Many household devices would run perfectly fine on high voltage DC (instead of AC).
The cost of the cables themselves would even be lower than for copper AC wiring.
Think of it something like planning the DC loads on a 55 ft sailboat.
People who do this will usually have one small 300 to 500W DC to AC pure sine wave inverter for laptops and chargers of small consumer electronics.
Your fridge is probably running “variable frequency drive”, so instead of being off and on at full blast, it just runs the compressor at 20hz or 30hz or whatever it needs continuously, and dialing that up or down depending on demand.
Your air conditioner too.
For bigger customers, this saves them a fortune in demand charges.
For others, it prolongs life by vastly reducing stop/starts.
tldr: it would make sense for an office building but actually not that much in a residential one.
Edit: This is not to say 50% cheaper is useless; it is a point of information for people living in houses who don't realise that a large part of the world's population can't afford a house.
Having half the world converge on urban centers scrambling to pay exorbitant rent and taxes for the privilege of 100sq feet of personal space while sharing an apartment with five strangers is misguided at best and dystopian at worst.
I see these small families in the new developments I sometimes drive through: there is maybe a meter of sand or grey grass between each house, and each house has its own roof, water heater, and windows looking into the fence 1m away. All of this has to be maintained. These people would have been better off living in an apartment in a high rise.
I doubt “rural” will continue to be “low crime”.
In our fully automated future I expect criminals will also be telecommuting — stealing, vandalising, or trading illegal substances with drones, drones which are quite possibly either bio-mimicking or even bio-printed so as to make it less appealing to shoot them.
I imagine we'll also see small robots that scurry into your garage when you open the door and whose one job will be to open the door later for thieves. That is probably already possible. So we'll need intelligent security cameras in garages too.
Drones that crash through a window and set the place on fire - I don't know what we'll do about that one. But anticipating and solving such problems would make an interesting startup.
There are various window films that can do an excellent job of making it quite hard to break a window. But they're currently quite expensive. E.g,: https://www.3m.com/3M/en_US/home-window-solutions-us/ (sorry for previous bad link, 3M website bozos make it hard to get a sane URL)
I live in a "stick house" and it would be quite easy to set it on fire from the outside. Cedar siding, shrubs near the walls, etc. No need to crash through a window.
Edit: I forgot to mention a roof made of "kindling". It wore out and I replaced it with a composition roof, but in Oregon it's trendy to have a "shake" roof. If you've ever examined a shake you will know it's just a very dry piece of wood (at least in summer, in winter it's very wet).
Fortunately we don't live in such a dystopian world ... yet, at least not in most parts of the USA.
Then there’s the varnishes and oils that you apply to repel water, which just makes it more flammable.
It seems to suggest that suburbs are generally the biggest emitters per-capita. Inner cities tend to be fairly low and some rural areas are low too, though that varies (rural areas in the midwest seem to be high).
If more people moved to rural areas, my guess is they'd start looking more like the suburban areas in energy usage, though.
I guess the question then is it easier to have more small towns or to have high density cities without surrounding sprawl.
Also it's a US only map. It would be interesting to compare with cities like Amsterdam, London, Shanghai, Hong Kong, etc. And of course the corresponding rural areas and small cities too.
I would definitely be interested in a more international version, too.
Maybe 20-30 YO college graduates will do a tour of duty in SF/NY for fun, but nobody in their 30s+ is going to want to go to bed listening to their upstairs neighbors party, their downstairs neighbors screw, and the homeless schizophrenic outside systematically shatter glass bottles on the sidewalk.
For retrofits of existing homes, geothermal whenever possible for upgrading central climate control when present. Current professional installers though are generally incompetent and price-gouge. Sufficiently motivated hackers can easily DIY.
I like what you're thinking about geothermal; it sparked a few thoughts, though: Geothermal is expensive because of labor, not necessarily price gouging. Most installers (in my area) put a 10% mark up on everything: finding competent installers is harder and harder these days, so you're paying more and more. HVAC techs can make six figures if they have the experience and aren't afraid to negotiate.
Unless, of course, they ARE price gouging, which happens, especially if they're already busy (which has been the case for the past three years).
The technology is solid, though to be frank, a high efficiency air-to-air heat pump will be comparable to geothermal, depending on the climate zone. If we can figure out a cost-effective way to replace gas-fired heat in northern climates with something more eco friendly (and consumer friendly), progress would be made.
If people are making $200k at FAANG, what's wrong with tradespeople making $100k when their skills are also in demand?
Everybody who supplies or resells equipment puts a mark up on it because there is some cost for them to select it, procure it, store it, transport it, finance it, etc.
In my experience, this is true. Comparing the electric heat system from one HVAC contractor to the ground source system I went with, there are a few expenses that are unique to ground source that start to add up. (USA - 2016)
$3,000 for the man and machine to dig the trenches and fill them back in.
Unknown - 2 days with a handful of laborers and 2,400 feet of pipe to lay and fuse together.
1/2 day to fill and purge the air from the pipes.
The equipment is more for sure, but I don't know an exact figure, but multiple thousands.
Overall, the 2 bids were about $15,000 different. Tax credits and even a rebate from the utility co-op made it close enough to a wash that I went with ground source and have been very happy.
When people ask about my monthly bill, the next question is "Why doesn't everyone do that?" I think many people don't care enough to deal with the sticker shock of a more expensive system. Another big reason is the amount of land required for horizontal trenching (there are other options, but this is generally the least expensive).
I also agree that if your climate is mild enough to use an air source pump, go for it. They don't quite cut it where I'm at (USA, just shy of Canada).
Put another way: the cost of operations isn't factored into the sales price. The builders don't have exposure to how much the buyer will pay to heat the home, so there's no reason to do extra work to insulate or seal better.
There are starting to be home efficiency standards that are being considered in the real estate markets (things like LEED certification for homes), but you're right, people are more likely to buy based on the look of the kitchen than the winter heating bill.
I've been trying to think if there's a financialization play possible... something like monetizing the downstream efficiency savings to incentivize the upfront work. The solar PPA model has been that the solar company's monthly bill becomes your new electricity bill, and (ideally) paying off the solar installation monthly over 20 years was still cheaper than paying for non-solar electricity during that time. I've been wondering if something similar can be done for home efficiency upgrades.
It's even more fundamental: no one knows about passive buildings. With homes, it's like we are still in the 50s with gas-guzzling cars. That required a major oil shock to force a shift in fuel economy. No such impetus yet for homes, but when it comes, hopefully we'll have figured out a way to retrofit our housing stock cost-effectively.
100-200mm of insulation on the outside of the house/loft/underfloor and triple glazing
It's not that trees can't provide the shade, but they take a while to grow enough to achieve the immediate effect of an awning. If you have a mature tree on a site, however, you could design a house that takes advantage of the shade it already provides.
Can I open my window more than 4”? No, someone could fall out.
Can we prop our doors open to get a draft? No.
The reduced external surface area to volume ratio really improves efficiency though.
I'm not aware of having to notify the local council or power company about installing solar (you do have to get certification before you can receive money for generation, but I don't think that costs significant money; it's more a case of proving the provenance of solar panels and equipment).
Permitting involves things like making sure the design is up to fire codes (eg there are setback requirements to make sure a firefighter can put a ladder on the roof) and the interconnection work with the utility that lets you send energy back into the grid. In the worst states, these extra costs are largely artificial barriers (ie outright anti-competitive tactics).
I believe Germany, for example, put a lot of work into streamlining all these permitting and interconnection processes, and as a result has much lower solar costs.
What this company appeared to have proposed in this contest is that by bolting solar onto pre-fab homes in the factory:
1. Installation costs drop, presumably because it's built into the design of the home and there's factory tooling around it instead of a retrofit system; and
2. Permitting costs go down because the design is standardized and presumably you can get economies-of-scale if you're connecting a whole subdivision or trailer park all at once.
This also nicely addresses the equity angle since prefab homes are generally aimed at lower income levels -- the claim here is that this building can drop the price of solar by almost half for this demographic.
Then I’d get 30% back from the government - and my state has no sales tax on this (which would have added 10%)
So, net cost to me of about $28k for a 13kw system
This does seem a bit pricey, but the tax rebate goes down a bit next year, so we will likely do it this year
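Backing the $/W out of these figures is straightforward (the only inputs are the $28k net, the 30% credit, and the 13 kW system size quoted above):

```python
# Back out the gross sticker price and $/W from the figures above.
net_cost = 28_000.0      # after the 30% federal credit
credit = 0.30
system_watts = 13_000    # 13 kW system

gross_cost = net_cost / (1 - credit)   # pre-credit sticker price
print(f"gross: ${gross_cost:,.0f} (${gross_cost / system_watts:.2f}/W), "
      f"net: ${net_cost / system_watts:.2f}/W")
```

That works out to roughly $3.08/W before the credit and $2.15/W after it, which lines up with the ~$3/W US vs ~$2/W UK comparison made elsewhere in this thread.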
Don't all UK installs over 6kW get counted as commercial still?
Basically the emotional meme here is to wrap oneself in a reality distortion field fueled by avarice.
If you removed the 30% tax credit, it would turn $3 a watt into the UK's ~$2 a watt. I know it's not that simple but it's not far from the truth. Even Tesla is trying to launch a product at $1 watt for commercial/industry clients.
solar + battery is a far better solution as it can soak up peak generation _and_ peak demand.
No way. The future of residential solar is coupled with storage. And the future of storage is in DC coupling, not wastefully inverting solar to AC just to invert back to put it in the battery.
How do Electric Transmission Lines Work?
The 20-50k that it would cost would just get absorbed into the mortgage and become an asset that would increase the value of the home. Besides that, over a 30 year period, if you are able to capture even half that value in electricity savings, it becomes a no-brainer, as it would cost what a normal roof costs anyway.
There are also the benefits of more efficient energy transmission and the resiliency of micro-grids, just as bonuses.
I wish someone would develop cheap adapters while construction codes get updated to accommodate panel frames.