DC Grids (fraunhofer.de)
81 points by Gravityloss 8 months ago | 89 comments



One of the fascinating challenges with HVDC is that a circuit breaker - an off switch! - is a surprisingly complex and expensive piece of engineering. If the poles aren’t separated quickly enough, an arc can form and current continues to flow. This is usually undesirable.

(Technically AC has the same problem, but the problem is much more acute for DC at the same voltage.)

You know the joke/story about how NASA spent millions of dollars developing a pen that can write in space, and the Soviets used a pencil? Well, the circuit breaker version of this is that the West built crazy complex huge high voltage circuit breakers, and the Soviets just built, uh, “single use” circuit breakers. With explosives.

(I don’t know if this is true, I have never found a source, but it feels true, it should be true, I so desperately want it to be true.)


Explosive circuit breakers are standard in electric car safety systems (in case the regular one fails). I can't imagine anyone using them in normal operation though.

The space pen story is made up as well: https://www.snopes.com/fact-check/the-write-stuff/


It's not quite wrong either. The truth is more complex. See https://www.scientificamerican.com/article/nasa-spent-millio... and https://www.reuters.com/article/factcheck-nasa-pens-idUSL1N2...

I think any of us would balk at a government agency spending 1300 bucks on a single pencil (first link). And the graphite conductivity problem is a non sequitur, given that the Russians used wax (or grease) pencils, not graphite. This is really the point of the anecdote: wasteful government spending.

Though apparently the Russians also find the pens useful and bought some as well.


> I think any of us would balk at a government agency spending 1300 bucks on a single pencil (first link).

Only people who don't understand how engineering and manufacturing projects work. $1300 is completely reasonable for a very low volume manufactured product, which was specified to meet specific criteria.

The reason it sounds unreasonable is because it is being called a "pencil/pen", instead of what it actually was, which was speciality equipment designed to be compatible with the conditions inside of the spacecraft.

At face value in meme format, the implied "free" solution would be buying a pen or pencil off the shelf, when in reality, that alone wouldn't have solved the problem. It is possible that someone could have evaluated off-the-shelf options to see if one would have met the criteria, but that would still have significant labor cost.


> which was speciality equipment designed to be compatible with the conditions inside of the spacecraft.

Right, that's the point.

It's also how you get the other very expensive things that the Pentagon gets dinged for.


What everyone forgets is how valuable each second of astronaut time actually is. $1300 for a slightly better pen is probably a great deal for the agency.

The shuttle program (excluding ISS) worked out to roughly 1 million dollars each hour an astronaut spent in space including when they were sleeping. Space stations are significantly better but anything that marginally improves astronaut effectiveness is worth quite a bit.


But those space pens are available to civilians now. In plastic disposable versions for a few bucks (Uniball Power Tank).

They are the only pens I know that stand up to modern use where you might only touch them once a week or less. Now they exist, and probably will be available indefinitely, and cheaply, and the problem of pens that just work anywhere is basically solved.

Isn't part of the point of NASA to develop tech that benefits everyone? Granted it's not a major life changing thing, but it's not just tossing money in the toilet either.


As I understand it, the point of the explosive is to have finer control over when the fuse blows. An ordinary fuse is more or less a resistor, and it blows when part of it gets hot enough. There is a fair amount of error, and this means they work best when there is a considerable margin between the current they must carry without blowing and the conditions under which they must blow. Conventional thermal-magnetic or hydraulic-magnetic circuit breakers are similar.

AIUI some high performance cars may draw so much current under maximum acceleration that the fuse needs to be dangerously large to avoid blowing when flooring it.

The solution is an electronically triggered fuse. A reliable and precise electronic circuit detects excessive current and blows a small pyrotechnic charge that opens the fuse. The analogous technology for circuit breakers is fairly mature in the commercial/industrial world — you can buy an electronically tripped circuit breaker, and there is likely one in an office building near you.

(Electronic trip devices for circuit breakers have ludicrous list prices, and there is no way a car company would pay anything resembling those prices for a car component that lets them eke out a bit more performance. I bet Tesla’s cost for its pyro fuses is quite low.)
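A minimal sketch (my illustration, with made-up thresholds, not values from any real vehicle) of why an electronically triggered fuse can run with a much smaller margin than a thermal one:

    # Electronic trip logic for a pyro fuse: the decision is a precise
    # comparison rather than "part of a resistor got hot enough", so the
    # gap between must-carry and must-blow current can be small.
    NOMINAL_MAX_A = 1500.0   # worst-case legitimate current (flooring it)
    TRIP_CURRENT_A = 1700.0  # trip point just above that, not 2-3x above

    def should_fire_pyro(current_a: float) -> bool:
        return current_a > TRIP_CURRENT_A

    for sample_a in (900.0, 1480.0, 2600.0):
        verdict = "FIRE pyro (single use!)" if should_fire_pyro(sample_a) else "carry on"
        print(f"{sample_a:>7.1f} A: {verdict}")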


> Electronic trip devices for circuit breakers have ludicrous list prices

The last one I bought was a GFCI trip unit for a 1000A 480v Siemens breaker and it was around $6000, and that doesn’t include the cost of the circuit breaker.


That's pretty impressively expensive, especially given that the innards should not have any particular dependence on the rated current. (I suppose the GFCI sensing circuit needs to tolerate an increased amount of induced current as the available fault current goes up.)


All switches have this problem, even low-voltage or AC ones. Slow switches, or switches that bounce, create arcs. This damages the contacts and can be hazardous.

This is one reason switches are "clicky". The action of completing or breaking the circuit must happen quickly. Switches have springs in them, which ensure the switch moves between the two extremes as quickly as possible. The springs oppose the motion for the first part of the travel; partway through, they suddenly start to assist the motion and force the switch the rest of the way.


Rather than separating the conductors more quickly, wouldn't it be preferable to replace the space they occupied with a material in which arcs cannot form? Or is this a cost/risk thing? E.g., I know SF6 is sometimes used for this purpose, but that is problematic as it is a potent greenhouse gas.


A typical miniature or molded case low-voltage (under 1000V) circuit breaker will have arc chutes to extinguish the arc. [0]

Higher power low-voltage circuit breakers, as well as medium and high voltage breakers, can use oil [3], (compressed) air [1], gas [4], and vacuum [2] to extinguish the arc flash.

SF6 is used for high voltage applications, while vacuum and air are common for medium voltage gear installed indoors; oil breakers are used in outdoor installations at utility substations and similar sites.

[0] https://wiraelectrical.com/what-is-an-arc-chute/

[1] https://www.se.com/us/en/faqs/FA360729/

[2] https://en.m.wikipedia.org/wiki/Vacuum_interrupter

[3] https://www.electricaltechnology.org/2021/08/ocb-oil-circuit...

[4] https://www.electricaltechnology.org/2021/08/sf6-sulphur-hex...


Most high voltage breakers already are SF6.

They rely on the zero crossing to quench the arc.


But is there not a zero crossing in DC currents?


The current is always positive except when it's turned off. It never reverses, so it never passes through 0 V on the way to negative. That is also the big advantage: you're not spending part of the time at lower voltage.
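A quick numeric illustration (assumed values, just to make the point concrete): a 50 Hz sine crosses zero about 100 times per second, while a DC level never does.

    import math

    f_hz, samples = 50.0, 10_000
    crossings, prev = 0, 0.0
    for n in range(samples):              # one second of samples
        v = math.sin(2 * math.pi * f_hz * n / samples)
        if prev * v < 0:                  # sign change = zero crossing
            crossings += 1
        prev = v
    print(crossings)                      # ~100 for AC; 0 for any DC level

Each of those crossings is a moment where an arc gets a chance to extinguish itself; DC never offers one.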


Like one metal blade that is placed in between the wires (current flows) and slid up to an insulating part (current does not flow)?


I am pretty sure if it were that simple it would have been done that way already. Breaking high currents is something we have had to do for more than a hundred years.

Arc flashes are no joke. We are talking about 2800 to 19000 degrees Celsius here. I don't know about your insulator, but it has to survive a multiple of the surface temperature of the sun. And ideally it withstands that more than once or twice.


That's not that easy. As soon as the two metals are separated, there will be an arc. The arc is really, really hot and will burn everything, and once an arc exists, it can be sustained over a much greater distance than a plain air gap could bridge. Since there is no zero crossing in DC, large DC switches are much more difficult to build. There are several options, from SF6 (avoided today) to vacuum or blowout magnets (which blow the arc into cooling chambers).


Would it be feasible to use induction to briefly induce a reverse current on a short section of the line that is of the same magnitude as the forward current so that there won't be arcing when you break the circuit there?


Kinda; actively sucking current/voltage difference away from a circuit breaker can allow you to interrupt a long DC line/part of a larger DC mesh without interrupting sufficiently distant users. Think train line/streetcar grid.



Tesla has pyro-fuses in their newer inverters and I believe in some battery packs as well.


Worth noting what each pyro fuse in the cars protects against.

The fuse in the motor inverters protects against failure of the MOSFETs (switches) inside the motor inverter. If any one of those switches gets stuck 'on', the motor ends up doing full-power braking. That happening while driving along the highway would be catastrophic - hence the pyro-fuse to prevent such things.

The fuse in the battery pack is to prevent a short circuit anywhere in the high voltage system of the car from short circuiting the battery. Obviously a short would cause some pretty huge currents to flow, probably causing the battery to overheat in a few seconds, giving off large amounts of flammable gas, which would immediately ignite (TV-style fireball explosion). And again, the pyro fuse prevents that.


What's the difference between a single use circuit breaker and a fuse?


A fuse is triggered by excessive current flow on the line that it breaks. A single use circuit breaker is triggered manually or by some external signal.


Nobody is confused about how to make a single use circuit breaker. But things like train stations have a need for repeated DC circuit breaking at high voltages.


I'm looking at an AC wall socket with a 5-outlet power strip, with 2 phone chargers, my laptop's power brick, and a fan plugged in. All of them are transforming AC into DC (well, maybe not the fan). I guess the problem here is that all those DC appliances have different input voltages, so a hypothetical DC wall socket would maybe be 12 V or multiples of that, and then we'd still need converters for the other voltages. On the other hand, all my AC appliances are standardized at 230 V.


The German wall socket (Schuko) is actually rated for 250V 10A DC. Or 250V 16A AC.

A DC plug isn't hard, considering this already assumes the user will pull the arc by ripping the plug out of, e.g., a flaming appliance.


I think that the most common DC plug is a USB one. As you said, we pull the plug. I've never seen an arc, not even in the dark, but they could be very tiny, or the voltages and currents involved are too small.

USB-A single-gang modules for the Euro standard wall plates are in stock at every shop; USB-C ones are less common. The point is that all of them are 5 V and about 1 or 2 A, some about 3 A. Furthermore, they are still AC to DC converters, just inside the wall instead of outside it. There is no single central converter in the house.

See some examples of those modules at https://www.amazon.it/bticino-living-usb/s?k=bticino+living+...


DC will one day be ubiquitous, but I think that day is >50 years away.

Today, DC is common in small 'islands'. At a small scale, you have DC in your phone charger and USB power supplies. At a medium scale, DC is used in high speed electric car charging. At a large scale, DC is used for undersea electricity transmission.

DC makes better use of the available conductors and insulators - for a given mass of copper and plastic, more energy can be transferred from A to B at a given efficiency. Modern DC/DC converters can convert voltages more efficiently and using less metal (i.e. cost) than AC transformers.

However... AC is still the standard. And changing power standards is awfully slow, because power cables in the ground can easily have a lifespan of 50+ years, and there is a chicken and egg problem involved with deploying a new standard.


> However... AC is still the standard. And changing power standards is awfully slow, because power cables in the ground can easily have a lifespan of 50+ years, and there is a chicken and egg problem involved with deploying a new standard.

The lines can theoretically stay the same, at least in the distribution network - you'd "only" need to exchange the equipment like transformers and switches.

DC has some pretty challenging aspects in implementation, and not just at the large grid scale:

- changing voltages requires active semiconductors instead of a (relatively) dumb transformer. This has been lessened by technological advances, but it's still more expensive.

- switching DC loads on and off is harder because the voltage never crosses the zero threshold - this is also why relays and circuit breakers are always rated for much lower currents in DC than in AC, and usually have lower cycle ratings as well, as there will be an arc that burns continuously.

- the same is also true for ground faults, say a tree branch: the arc isn't "automatically" extinguished once the voltage drops (which happens every 1/50 second)

- it's more difficult to have an actual grid, most current implementations are point-to-point only

- DC introduces the potential for very weird "stray currents" and resulting electrochemical corrosion

- unlike with multi-phase AC, the magnetic forces in a cable that are generated by current flow don't cancel each other out, so that needs to be taken into consideration to verify if cables are suited for DC transmission


Can you (or someone else) write more about the advantages of DC/DC voltage conversion? If I understand correctly this was the main advantage of AC, and the reason why was it chosen.


In the olden days, AC allowed easy voltage conversion with a transformer. The transformer converts the electricity into magnetism in a steel core, and then back into electricity at a different voltage.

This process is quite efficient but requires a lot of steel, since in a 60 Hz AC system, 1/120th of a second of the energy being converted has to be stored as a magnetic field in steel - and steel isn't a particularly good 'store' of magnetic fields...

Modern DC/DC systems actually have similarities! But instead of operating at 60 Hz, they tend to operate at more like 1,000,000 Hz. That means far less copper and steel is needed. Unfortunately, 1,000,000 Hz power has a habit of leaking out of cables and becoming radio waves, so we can't send it long distances like that - so we convert it to DC before and afterwards. The conversion to DC is done with electronic switches switched at 1 MHz or more - usually MOSFETs are used, and one promising but expensive type is a GaN FET. It turns out that the DC->AC, transformer, and AC->DC setup can also be combined and simplified a bit, and we call the result a buck/boost converter.

Overall, buck/boost converters can normally convert DC voltages for less money and at higher efficiencies than their AC transformer counterparts - mostly due to the higher operating frequency allowing use of far less steel and copper, and allowing other engineering tradeoffs be made in the direction of efficiency.

However, neither DC/DC nor transformers have any theoretical cap on efficiency - and with an unlimited budget, you could make either with an almost arbitrarily high efficiency.
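A back-of-the-envelope version of the steel argument above (10 kW is an arbitrary example power): the magnetics must buffer roughly half a cycle's worth of energy, per the 1/120th-of-a-second figure.

    P = 10_000.0              # watts being converted (assumed example)
    for f in (60.0, 1e6):     # mains transformer vs ~1 MHz DC/DC
        E = P / (2 * f)       # joules buffered per half-cycle (order of magnitude)
        print(f"{f:>9.0f} Hz -> ~{E:.4g} J per half-cycle")
    # 60 Hz: ~83 J; 1 MHz: ~0.005 J -- roughly 17,000x less stored
    # energy, hence far less steel and copper for the same power.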


Citation needed on DC/DC converters being as efficient as AC transformers. I'd agree they can cost less, but AC transformers are wildly efficient. 99% efficiency is not unheard of. Full load efficiency can be extremely high too.

There are basically no DC/DC converters that hit that efficiency at any load.


Give the benefit of doubt and read the surrounding context more carefully.

Not only did they mention (paraphrasing) that with unlimited money/resources AC transformers could be made far more efficient, but the part you are critiquing is simultaneously comparing COST for SAME-efficiency parts (typical ones, I assume) and the resulting efficiency at similar COST.


AC transformers have fixed thermal power density, but load power scales with mass to the 4/3rd power (i.e., scales with linear size by the 4th power, while volume and mass and loss only scale to the 3rd power).

And no, 99% isn't hard for a resonant switched capacitor converter. They just happen to be restricted to integer voltage ratios. (With sometimes a few percent regulation around this ratio without substantial efficiency loss.)
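A quick numeric version of that scaling argument (normalized units, my own illustration):

    # Linear size k: rated power ~ k^4, loss (and mass) ~ k^3,
    # so the loss *fraction* shrinks as 1/k for bigger transformers.
    for k in (1, 2, 4):
        print(f"scale x{k}: loss/power ~ {k**3 / k**4:.2f}")
    # x1 -> 1.00, x2 -> 0.50, x4 -> 0.25 (relative to the small unit)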


Changing the properties of electricity (i.e. trading current for voltage or vice versa) requires a trip through the magnetic field, and that means varying the current through a conductor. With AC that's easy, as the current is already changing, so you can trivially (passively) convert between voltages at the same frequency, with the efficiency cost being the heating of the coil conductors in the transformer.

With DC it's harder because you don't have the time-varying nature necessary for the magnetic field, so you have to turn the DC on and off, which requires a switch. Nowadays we have very fast switches (transistors) that, together with temporary energy storage (capacitors and inductors), allow us to tune a circuit to the power required. Ignoring (or shielding) the RF interference created by fast switching, we have systems that can efficiently convert between one DC voltage and another.
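In the ideal case the resulting buck converter obeys a very simple relation - chop the input at duty cycle D, filter, and the output is D times the input. A lossless sketch (real parts add a few percent of loss):

    def buck_vout(v_in: float, duty: float) -> float:
        # Ideal buck converter: average output = duty cycle * input.
        assert 0.0 <= duty <= 1.0
        return v_in * duty

    print(buck_vout(48.0, 0.25))   # 12.0 V from a 48 V bus at 25% duty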

I'm not so sure we'll have DC to the home for supply; a zero crossing is helpful to keep circuit breakers small and to reduce damage from brief, accidental contact (e.g. broken insulation on a lamp, etc.).


- AC motors are very easy to build and are very, very durable.

- AC/AC voltage transformation is very easy and very effective.

- Switching AC is much easier than switching DC.

- The grid we have today was not made for small cellphone chargers. It was made for light and motors.


One big advantage is that you don't have a wave you have to sync with (which is why DC is used for connections between grids). One of the harder parts of AC grid management is that every generator on the grid has a timing component to make sure it's producing power in the same waveform as the grid. It's not enough to produce AC at 60 Hz; if that 60 Hz is misaligned, then your generator turns into a load on the grid.

It also means that if 2 grids aren't in sync, they can't connect (even if they are both 60 Hz) without some expensive equipment to sync the waveform. A 60 Hz grid cannot connect with a 50 Hz grid without a DC stage.
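A toy model of why alignment matters (illustrative, normalized amplitudes): for equal-frequency sinusoids, average delivered power scales with the cosine of the phase error.

    import math

    for phase_deg in (0, 45, 90, 180):
        phi = math.radians(phase_deg)
        p_avg = 0.5 * math.cos(phi)   # mean of sin(wt)*sin(wt+phi)
        print(f"{phase_deg:>3} deg off -> average power {p_avg:+.2f}")
    # 0 deg: +0.50 (full delivery); 90 deg: 0; 180 deg: -0.50 --
    # the "generator" is now a load on the grid, as described above.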


There will never be a switch to DC in the home. The advantage is small, and it would involve changing everything. You would have to replace every outlet and switch, every appliance, every light socket and light bulb. The only way that would happen is when building in a new place, like on Mars.

There aren't any standards for DC in the home, no plugs, and no voltage. The voltage would be pretty high, 480V is likely, that would have to step down for USB and lights. It is telling that boats and RVs, which use 12V and 48V, just have inverters and AC plugs.

The only place where DC in the home makes sense is between battery backup and solar panels. That way you can have one large, efficient inverter instead of inverters on the batteries and each panel. There isn't much difference between an AC inverter and a DC power converter. Although, I think you would still need DC converters between the solar panels (whose voltage varies), the battery, and the inverter/DC wiring.


Lights and TVs need a fraction of the power they did 30 years ago.

That means they can run off USB C even though it's only specced to 180 W.

Converting an electric geyser (water heater) to run from USB C should even be practical, especially for a bachelor who typically only uses it for showering.

https://en.m.wikipedia.org/wiki/USB_hardware#USB_Power_Deliv...


Maybe. It sort of depends on how renewables shake out.

I could see DC gaining in popularity because you don't have to invest in the hardware to convert renewables to AC for transmission. But at the same time, IDK, the AC grid works just fine, so I have a hard time envisioning replacing the whole grid with a DC grid. Just doesn't seem like there'd be enough gains.

I do think intergrid connections will be more common. Micro/macro grids might also be more popular. Perhaps we start seeing subdivisions with their own grid/battery backups to improve reliability and allow the overall grid to disconnect them temporarily under load?


> Today, DC is common in small 'islands'.

I'll take those 'islands' to include examples you mentioned.

Outside such specific cases, the advantages of AC usually outweigh the advantages of DC. Or: DC's disadvantages outweigh AC's, especially the safety-related ones.

In short: AC will stay for utility scale & in-home power distribution. Regardless of history.

But within e.g. a solar farm, or a vehicle, yes, DC may be more practical. And thus... used there.


San Francisco has a little-known but well-used (mostly by older elevators) DC grid: https://spectrum.ieee.org/san-franciscos-secret-dc-grid


I don't know how relevant this is at the grid level, but the Hotel Marcel is a refurbished building that runs on DC power. People point out how they use PoE for room lighting, and there are more benefits. https://www.smartbuildingstech.com/intelligent-building-syst...


A net-zero building next to a six-lane road next to a ten-lane highway. I'm not sure if the lighting in that building is the right thing to focus on.


By that argument you can never focus on anything, since there will always be a worse offender somewhere.


That's right: we should postpone all other attempts to improve our energy infrastructure till our network of highways is destroyed or at least reduced to 2 lanes everywhere.


It was really mind-blowing the first time I did an inventory of the appliances and gadgets in my house that had to convert AC back to DC, after the power had been collected as DC and converted to AC. Not just the electricity being used, but also the wasteful resources needed for wall warts and built-in transformers.


But this is about 380 V DC, which is not going into your living room anytime soon.

Sadly, it is less safe than AC. Automakers tried to increase DC voltage beyond 12V but it causes sparks that cause mechanical switches to fail. AC sparks are extinguished whenever the voltage wave crosses zero. Ubiquitous power DC needs high power/high voltage solid state switches to become cheaper and more reliable than mechanical ones. Perhaps SiC or GaN transistors will do eventually.


re: 12V automotive LV systems: it's a little more nuanced than that. Tesla's Cybertruck will be running a 48V LV system[1], and while relays, fuses, efuses, and DC-DC converters across the LV system have to be rated for higher voltages, you gain efficiency back in I^2R losses across the entire harness, and can drop your required wire gauge since the necessary current carrying capacity is reduced by a factor of 4.

So it's a nuanced trade-off, and if the industry shifts (which Tesla is banking on since they're the 'leader') then economies of scale can be reached with higher voltage fuses, switches, relays, etc.

[1] - https://auto.hindustantimes.com/auto/electric-vehicles/tesla...
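A quick numeric version of that I^2R point (illustrative harness resistance and load, not Tesla's numbers):

    R_harness = 0.010    # ohms, assumed
    P_load = 1200.0      # watts, assumed

    for v in (12.0, 48.0):
        i = P_load / v               # 4x the voltage -> 1/4 the current
        loss = i * i * R_harness     # -> 1/16 the resistive loss
        print(f"{v:>4.0f} V bus: {i:>5.1f} A, {loss:>6.2f} W lost in harness")
    # 12 V: 100 A, 100 W lost; 48 V: 25 A, 6.25 W lost.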


It is possible for high voltage DC to be made safe. Currently, techniques to do so are neither cheap nor off-the-shelf.

For example, imagine I want a 3000 volt DC wire to power a portable air conditioner. The air conditioner will be 10 kilowatts, so 3.3 amps. The wire can be thinner than headphone cables (two 0.3mm conductors, +-1500 volts, 150um PTFE coating) if desired.

Obviously, with such thin insulation, the system needs to be human safe when chewed through by a baby. To ensure that, the current flow through the baby must be under 1 milliamp, or 10 millijoules through the baby's heart. That can be ensured by tracking the current through each conductor, accurate to 1 milliamp, and shutting off the supply if there is ever more than 1 milliamp unaccounted for (either to earth, or to the other conductor). The shutoff must therefore happen within 1 microsecond (assuming the worst case fault, that is, all 3.3 amps direct to the baby's heart). That in turn puts capacitance and therefore length limits on the cable - it wouldn't be possible for this cable to be safe beyond ~1000 feet.

TL;DR: It is very possible, with today's technology, to design very high voltage DC systems safe enough for use within a home. However, no hardware available off-the-shelf yet can do this, due to no demand.
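A back-of-the-envelope check of the figures above (same assumptions as in the comment, not a real safety analysis):

    P = 10_000.0     # W, air conditioner load
    V = 3000.0       # V, across the +-1500 V pair
    print(f"load current: {P / V:.2f} A")        # ~3.3 A

    E_limit = 0.010  # J, assumed survivable energy through the heart
    t_max = E_limit / P                          # worst case: all of it through the fault
    print(f"required shutoff time: {t_max * 1e6:.0f} us")   # ~1 microsecond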


Good catch with the capacitance. One then needs to consider parasitic capacitances to stuff outside the cable, too. It might end up being necessary to make cables with integrated protection circuits along their whole length.


Or just use thicker insulation? Not aware of any AC unit that uses such a thin power cord.


It is unlikely any future high voltage DC system would be sufficiently safe without tracking leakage current as I outlined. Partly because it is almost impossible to stop someone cutting through the insulation, however thick (eg. with a kitchen knife, lawnmower, fire, etc).

3000 volts DC is a "definitely dead" voltage, as opposed to current 110 volt AC systems which are "you'll probably survive" if you use a kitchen knife to cut through the insulation.

Given that you need the protection systems in place anyway, there isn't much point in thicker insulation, unless you like your cord being more cumbersome, heavier and more expensive.


Thanks for the explanation. I wonder if you could instead detect integrity issues in the insulation and use that to shut down the system.


They would have to do conversion regardless. It might be a little more efficient DC-DC, but it's not like you could just have 5 V lines everywhere; you might wind up using as much copper in the thicker lines as you do in the power adapters.

Especially now with USB C. I suspect in 20 years a lot of today's PD supplies will be perfectly good. It's less wasteful when they're that reusable.


The #1 purpose of such wall warts is to provide safety by means of galvanic isolation (in use, any part you can touch on the wall wart, or on the device it powers, has no direct electrical connection to the HV side). That's where a transformer comes in.

When using a transformer, the voltage conversion comes 'free'. Modern electronics makes this smaller and lighter & uses less metal (not more reliable, btw ;-)

On the generation side (for example rooftop solar), it isn't a big deal to have one powerful, high-efficiency converter.


Yes, though AC is still safer and easier to transport. It's just the conversion that is wasteful. DC grids might be a good option, but not without risks, especially at higher voltages.


There is a long-distance DC line from The Dalles, Oregon, to LA that has been around since the early '70s. It's been upgraded a few times and now carries 3 GW. It's very distinctive from the other lines nearby, since it only has 2 wires.

The grounding loops are very impressive. 6 mile loop of buried cable at either end.

https://en.wikipedia.org/wiki/Pacific_DC_Intertie


It's generally understood now that using the earth for transmitting electrical current is a bad idea... It causes more corrosion in building foundations even hundreds of kilometers away, and soil microbes that navigate with electric fields die.

Therefore, most new DC transmission systems have a balanced pair of cables, and only use earth return for emergencies.


The article says that the intertie has one +500 kV line and one -500 kV line. So why does current need to travel via earth? I would think the DC current flows in a loop via the two conductor lines.


I might misremember, but aren't some of the submerged sea power cables (Norway, Germany, UK) DC? If so, why? And how does that square with AC being more efficient?


There’s a dedicated DC trunk line running down the entire west coast. It’s how hydropower generated in Washington state is sold to California. High voltage DC to bridge grids is really common.

https://en.m.wikipedia.org/wiki/Pacific_DC_Intertie


A whole bunch of different effects.

AC/DC have different costs for voltage converters and per-distance efficiency, so there are some distances where AC makes more sense and others where DC makes more sense; the distance changes as tech improves.

AC mostly conducts on the outer surface of the wire, while DC conducts with the whole cross section, giving you different scaling issues as the current changes.

Under water, AC suffers from significant capacitive loss — the wire acts as one side of a capacitor and the entire ocean as the other.

At certain frequencies and wire lengths you also get inductive losses, though IIRC that affects only RF cables in practice, and in theory the design of the Trans-Siberian Railway, as no other place ever seriously considered having a sufficiently long conductor for the frequency used.
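For the skin-effect point, the standard formula gives a concrete number for copper at mains frequencies (my calculation, textbook constants):

    import math

    rho = 1.68e-8          # ohm*m, copper resistivity
    mu = 4e-7 * math.pi    # H/m, copper permeability ~ mu0

    for f in (50.0, 60.0):
        delta = math.sqrt(2 * rho / (2 * math.pi * f * mu))
        print(f"{f:.0f} Hz: skin depth ~{delta * 1000:.1f} mm")
    # ~9.2 mm at 50 Hz, ~8.4 mm at 60 Hz: beyond that depth the current
    # density falls off sharply, so fat AC conductors waste their cores
    # while DC uses the full cross-section.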


I don't know if it is the same issue, but AC lines longer than a certain distance will approach the wavelength of an antenna and start to radiate energy (like a radio broadcast antenna). In that case you use DC. When I studied this back in school, I was told there was such a line in South Africa (if I recall correctly).


As others mentioned, undersea cables suffer from capacitive losses. But another big factor is grid synchronisation: two AC grids cannot be joined if they are not perfectly synced. Their phase must match. DC having no such phase, it’s very useful to connect independent grids (e.g. UK to the rest of the EU, or the three main grids in the US)


Disclaimer that I'm not an expert. As far as I understand, AC mostly runs near the surface of the conductor rather than through its full cross-section; DC does not. That's one of several reasons why.


My understanding here is that the saving comes from needing fewer cables, although I’m sure there’s a lot more to it than that.


AC is quite a lot more dangerous than DC for humans at ~hundreds of Volts. ElectroBOOM has made a nice demonstration of this: https://www.youtube.com/watch?v=snk3C4m44SY


DC arcs are much harder to extinguish. Switching 10 A of 240 V AC is trivial. Switching 10 A of 240 V DC is much more complicated in terms of switchgear. Sustained arcs are dangerous.


For more information regarding a Dutch initiative, in collaboration with Schneider Electric visit: https://www.dc.systems/

(I'm not affiliated, just know of the project due to my work in DC grids in offices)


Wasn't this settled back in the days of Tesla and Edison? Their argument is that renewables are mostly DC. It looks like they mostly care about the combination of SPV and battery storage.

Case 1: SPV 12 kWp @ 200V -> Inverter (eff=96.2%) -> 200m of 8 AWG wire -> AC-DC switching power supply (eff=92%)

Case 2: SPV 12 kWp @ 200V -> 200m of 8 AWG wire -> DC-DC buck converter 90%

In the first case we get 9.16 kW; in the second case we get 7.596 kW. In Case 2 you need the buck converter because the DC voltage will vary with the temperature of the cable itself. The voltage loss in this case will be roughly 56 V. In both cases I disregarded the effect of MPPT, as it will be the same in both cases.

This suggests that when the cables are short enough (probably about 50 m), it might make sense. For anything else we are probably better off with AC networks.

One remark is that an industrial warehouse with rooftop SPV will probably need AC too. This might work for single family houses though.

With AC you can step up to a higher voltage with similar efficiencies, which will lead to lower transmission losses. I kept the voltage at 200 V for simplicity, but going from 200 V to 800 V would lead to a transmission loss of 3.43% instead of 13.73%. This way AC is the clear winner.


The "war of the currents" was between Edison and Westinghouse. Tesla worked for both of them, but had very little influence on this issue. In the 19th century, using AC power was necessary because transformers were the only way to generate high enough voltages for long-distance power lines. This is no longer the case in the age of high-performance semiconductors.

In your calculation, you are comparing apples to oranges. An "AC-DC switching power supply" is just a rectifier followed by a DC-DC converter. There is no way for your "Case 1" to be more efficient than your "Case 2" because it simply contains two more conversion steps. This is exactly what the Fraunhofer project is aiming to eliminate.


Feel free to re-calculate the numbers and point at the mistake. I calculated everything to the best of my knowledge.

At the same voltage, DC will in all usual cases have higher losses than AC. This is basic physics.

Non-constructive critique helps nobody.


I think with new(er) GaN (or SiC) DC/DC buck conversion, 380V systems can get up to 98.2% efficiency, typically around 94% now.

Edit: And the voltages of these types of DC grids are often higher (600V - 1000V, or even up to 1500V industrial), so losses in the cables are lower.


AFAIK PC power supplies are actually among the most efficient power supplies out there, so I'll use them as a reference. The Titanium class of power supplies is around 94% efficient at 100% load and 90% at 20% load [1]. On the other hand, plain old transformers start at around 95% and can go even higher; a random internet source claims 98.5% efficiency is achievable [2]. Of course you cannot run your computer on AC. I had mostly grid-scale equipment in mind; I just wanted to use something relatable as an example.

[1] https://www.velocitymicro.com/blog/what-is-psu-efficiency-an...

[2] https://www.electricaleasy.com/2014/04/transformer-losses-an....


Yes. The original commenter is a bit behind the times. There are also reliability advantages.

> new(er) GaN(or SiC) DC/DC buck conversion

Can you share guidelines about costs?

For comparison, utility scale AC inverters for PV are ~$5-$10/kW


Original commenter might be a bit behind on many things but this is actually my day job.

Do you have a source for the reliability claim? There are so many good old transformers deployed around the world, many of them working for decades without a replacement. I'd be suspicious that a buck converter would have a longer MTBF than an oil-submerged transformer.

Also, I don't see why it is mentioned here, but NREL estimates the cost of grid-scale inverters (for installations of 100 MW) to be 5-10x more than what was mentioned. [1]

[1] https://www.nrel.gov/docs/fy22osti/83586.pdf


> but this is actually my day job.

You are probably more knowledgeable than me. I am just here for the cake.

However, AFAIK the most fault prone components in PV are inverters and (where applicable) transformers.

> 5-10x more than what was mentioned.

This was my mistake. I started with 5c/W and missed a decimal place.

None of this addresses the key issue - you mentioned 10% loss for a simple DC converter, and the next commenter [1] mentioned the newer generation of far superior alternatives.

[1] https://news.ycombinator.com/item?id=37260441


They didn't have frequency converters / fast high power transistors back then. With AC, AFAIU, it's easier to change voltages with primitive equipment. But then you have to have the frequency the same everywhere.


What about capacitive and inductive losses? These are significant in AC transmission lines, depending on their construction. I believe that's why high voltage interconnects between different countries are usually DC and use rotary converters at the ends.


The main reason for HVDC historically was that it doesn't require grids to be synchronous, or even of the same frequency. This is often desirable because a synchronous grid requires deep organizational integration. HVDC also lets you use monopole systems fairly easily, since they're single phase; the sea or ground becomes the return path.


They are both taken into account in the calculation. I assumed power factor of 0.85.


AC has the great advantage of being well supported. However, scratchpad calculations can't prove that.


For non-industrial use cases there is https://open-dc-grid.org/, which appears to be suspended for now.


I think an open DC grid standard would be amazing, but I'm not so sure about 48v.

What I would do is probably just use 12v nominal for everything, and use voltage levels to signal instead of trying to do real smart communication.

11-14.4v: you're running on battery; if you're a battery, supply until the bus is in that range. 15-18v: solar is present; if you're a battery, you can charge.

Anything more can be figured out later.
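A minimal sketch of that voltage-window signalling (thresholds copied from above; everything else is my assumption):

    def bus_state(v_bus: float) -> str:
        if 11.0 <= v_bus <= 14.4:
            return "on battery: batteries should supply"
        if 15.0 <= v_bus <= 18.0:
            return "solar present: batteries may charge"
        return "out of band: treat as fault / disconnect"

    for v in (12.6, 16.2, 19.5):
        print(f"{v:.1f} V -> {bus_state(v)}")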

You could have different 'tiers' for other voltages too, but 12v seems to be fine. Inverters are cheap-ish and would be even cheaper if they were used more; just use those for 'real power', and optimize the microgrid for what it's really good at: portable and very small setups with a few hundred watts of total power.

Enough things are only used intermittently, and we have ways to make batteries safe, might as well put the batteries closer to the load.

If you have something super high power, like a kettle, it can do its own step-up to 120v or 48v, but participate on the bus as a 12v device, and just slow charge at a few amps.


48V DC is already in the range of dangerous voltages to work with in the home. The human body can bridge such a voltage, but as opposed to AC, muscles will freeze in position, effectively leaving you stuck in the shock position.

I remember visiting a datacenter once that ran telco equipment on 48V DC. They were much more paranoid about us getting close to that equipment compared to the AC equipment, because they'd need to "unstick" us. Cool gear nonetheless.



