

Ask HN: Should I charge my electronic devices with a 5W, 10W or 12W power adapter? - zuck9

Time doesn't matter, what is best for the battery? Like, iPads support 12W at most.
======
bgamari
Most (essentially all) personal electronic devices charge at a fixed supply
voltage (often 5 volts) and will draw up to some maximum current. The product
of current and voltage is power. Since the voltage is fixed, the power rating
of an adapter is really telling you how much current it is capable of
sourcing. For instance, in the case of a 12-watt 5V adapter, the supply is
capable of providing up to 2.4 amperes of current. That is to say, the supply
only guarantees it can keep its output at 5V if the device draws less than
2.4A of current. If the device draws more, the voltage may sag, which may
cause the charging process to terminate.
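
As a quick back-of-the-envelope sketch in Python (assuming the common fixed
5V USB supply; the wattages are just the adapter ratings from the question):

    SUPPLY_VOLTAGE = 5.0  # volts; the usual USB charging voltage

    # P = V * I, so the maximum current an adapter can source is P / V.
    for watts in (5, 10, 12):
        max_amps = watts / SUPPLY_VOLTAGE
        print(f"{watts:>2} W adapter -> sources up to {max_amps:.1f} A at 5 V")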

What this means for your question is this: all other things being equal, the
power rating of the adapter makes no difference to the charging process. So
long as the device does not draw more current than the adapter is capable of
supplying, the charging process will proceed at the same rate.

~~~
pawn
I want to be sure I understand you correctly, because I'm not too confident in
this subject. You're stating that if my device is designed to draw less than
2.4A, but I charge it using a 2.4A charger, it'll frequently charge and
stop charging, then charge again? Does this also mean that I should be
careful not to use a beefy charger for my phone, rather than thinking "bigger
is better"?

~~~
davismwfl
No, I don't think that is what was being communicated.

Basically, your device is designed to charge at a specific rate, and it will
draw only up to a specific amount of current (amps). Having a charger that is
capable of delivering more current doesn't mean your device will draw that
current. A cell phone, for example, will have a varying charge rate,
generally between 0.5 amps and 1 amp, with some devices capable of drawing 2
amps. But if you plug your phone into a 5 amp charger when the phone's
maximum rate of charge is 1 amp, it will only draw 1 amp.

So in the end, bigger does not equal better, except to say that your device's
highest charge rate should be equal to or less than what the charger can
supply, which helps keep heat to a minimum.
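
A toy model of that rule in Python (the function and variable names are
purely illustrative, not from any real charging spec):

    def charge_current(device_max_amps, adapter_max_amps):
        """The device's charge controller sets the draw: it never pulls
        more than its own limit, and it can't get more than the adapter
        supplies. (A toy model; real hardware whose draw exceeds the
        adapter's rating may instead sag or stop, as noted above.)"""
        return min(device_max_amps, adapter_max_amps)

    print(charge_current(1.0, 5.0))  # 1 A phone on a 5 A charger -> 1.0
    print(charge_current(2.4, 2.0))  # 2.4 A iPad on a 2 A charger -> 2.0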

~~~
bgamari
Correct.

------
trimble-alum
Likely the smallest wattage will limit the maximum amount of waste heat the
batteries experience, which means less wear (physical, chemical, thermal, and
electrical damage). Also, opt to charge the device at the _lowest_ device
operating temperature and with the most available cooling (battery side
up or standing near vertical, not in a sleeve under a blanket). And from
previous articles, charge from 50% up to about 70%, discharge, rinse-lather-
repeat. Store the device for extended periods at about 60% charge at the
lowest possible _non-operating_ temperature (probably 40-50 °F, 4-10 °C).

Beware: going from cold to heat too quickly often leads to internal
condensation in humid weather, which can short out a device if ionic
impurities are on its internals. So when bringing a _cold_ device into a much
_hotter_ or more _humid_ room, give it enough time to warm _gradually_ so
condensation doesn't form (say, limit the temperature change to 10 °F /
5.5 °C per 30 minutes). Most devices still power some components while
"off," so a condensation short is still a remote but plausible possibility,
which is why avoiding condensation is a good idea. BTW, a "perfect" gadget
would be waterproof, float, AND either include a hygrotherm to evaporate
thermal-transition condensation or have no internal air pockets where
condensation could form.
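
If you want to put a number on that guideline, here's a rough Python sketch
(the rate and the function name are just illustrative):

    def min_warmup_minutes(start_f, end_f, max_delta_f_per_30min=10.0):
        """Minimum time to let a device's temperature change gradually,
        under the rough 10 F-per-30-minutes guideline above."""
        return abs(end_f - start_f) / max_delta_f_per_30min * 30.0

    # e.g. bringing a device from a 45 F car into a 72 F room:
    print(min_warmup_minutes(45, 72))  # 27 F swing -> about 81 minutes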

