Short answer: NO.

This question comes up every time there's a discussion about low-voltage power distribution. The whole reason we use high voltages in the first place is to reduce resistive losses. Ohm's Law (V=IR) plus the formula for power (P=VI) tell you everything you need to know: the power dissipated in the wiring is P_loss = I^2 * R. For a given amount of delivered power, if you increase the voltage by a factor of 2, you decrease the current by a factor of 2; the wire's resistance stays the same, so the I^2 term means the power lost in transmission is 1/4 as much.
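
A quick sketch of that scaling (the wire resistance and load power are made-up values, just to show the ratios):

    # Deliver the same power through the same wire at different voltages
    # and compare the I^2 * R losses. Values are illustrative assumptions.
    WIRE_RESISTANCE_OHMS = 0.1  # fixed for a given run of wire
    LOAD_POWER_WATTS = 100.0    # power we want delivered

    for volts in (5, 120, 240):
        amps = LOAD_POWER_WATTS / volts          # P = V*I  =>  I = P/V
        loss = amps ** 2 * WIRE_RESISTANCE_OHMS  # P_loss = I^2 * R
        print(f"{volts:>4} V: {amps:7.3f} A, {loss:8.3f} W lost in the wire")

    # 240 V loses 1/4 of what 120 V does; at 5 V the same 0.1-ohm run
    # burns 40 W in the wire just to move 100 W.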

If you tried putting 5V USB ports in your house, you wouldn't be able to use wire to connect them to the central power supply; you'd need massively thick copper bus bars. The cost would be astronomical, and you'd need huge channels in the walls to hold them.
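
To put a rough number on it, here's a back-of-the-envelope sizing for one 5 V circuit. All the specific numbers (a single 1500 W load, 40' of run, 5% allowable drop) are my assumptions, not from the thread:

    RHO_COPPER = 1.68e-8      # resistivity of copper, ohm-metres
    loop_m = 40 * 0.3048 * 2  # 40 ft one way, doubled for the return path
    power_w, volts, drop = 1500.0, 5.0, 0.05

    amps = power_w / volts                # 300 A to move 1500 W at 5 V
    r_max = volts * drop / amps           # max loop resistance, ~0.83 mOhm
    area_mm2 = RHO_COPPER * loop_m / r_max * 1e6  # A = rho * L / R
    print(f"{amps:.0f} A, cross-section ~{area_mm2:.0f} mm^2 of copper")
    # ~490 mm^2: a solid bar on the order of 2 cm x 2.5 cm, per circuit.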




To give some context, 5 V @ 1 A takes ~12 AWG wire, which is around the size of the wiring installed on a typical 120 V, 20 A circuit in the US.

At anything more than 1-2 A, the wiring gets silly. At 5 A, which is around the power level used for LED lighting, you need 6 AWG (13 mm^2) wire.

These figures were calculated assuming an average of 40' of wire from converter to outlet. A per-room converter would be more reasonable, but you might be better off doing something like 120 V DC, so every non-motor device could drop its bridge rectifier.
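
For anyone who wants to check the arithmetic, here's a sketch of the sizing calculation under assumptions I'm guessing at (solid copper, ~3% allowable voltage drop); the helper names are mine:

    import math

    RHO_COPPER = 1.68e-8  # resistivity of copper, ohm-metres

    def awg_area_mm2(awg: int) -> float:
        """Cross-sectional area of a solid AWG conductor, in mm^2."""
        diameter_mm = 0.127 * 92 ** ((36 - awg) / 39)
        return math.pi / 4 * diameter_mm ** 2

    def thinnest_awg(volts: float, amps: float, one_way_ft: float,
                     drop: float = 0.03) -> int:
        """Thinnest AWG gauge that keeps voltage drop within budget."""
        loop_m = one_way_ft * 0.3048 * 2  # out and back
        min_area = RHO_COPPER * loop_m * amps / (volts * drop) * 1e6
        for awg in range(40, -1, -1):     # high AWG = thin; first fit wins
            if awg_area_mm2(awg) >= min_area:
                return awg
        raise ValueError("thicker than 0 AWG: time for bus bars")

    print(thinnest_awg(5.0, 1.0, 40))  # -> 12, matching the figure above
    print(thinnest_awg(5.0, 5.0, 40))  # -> 5, a gauge over the 6 AWG figure

At 5 A this lands one gauge heavier than the 6 AWG above; the result swings a gauge either way depending on the drop budget you pick.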
