mAh Is an Industry Mistake (reddit.com)
77 points by tosh on Oct 12, 2023 | 122 comments



This has bugged me for so long and I hope the industry finally stops someday.

To summarize for anyone not versed in electrical power:

* mAh measures how much current your battery holds, irrespective of voltage.

* However, actual POWER is measured in Watts (or watt-hours cumulatively).

* Watts = current * voltage.

* A 2000mAh hour battery at 2 volts has half the power of a 2000mAh battery at 4 volts.

* As voltages can vary based on battery arrangement (parallel vs series cells), this makes a huge difference. You can halve or double your mAh by arranging your cells differently, without changing actual power stored.

Using Wh or mWh instead of mAh would make this whole problem go away. But then that means low voltage batteries (like those often used in phones) can't inflate their reported mAh compared to high voltage cells (e.g. in power tools). They also tend to conveniently leave out the battery voltage, so you can't tell when it's an apples to apples capacity comparison.
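
To put numbers on it, here's a minimal sketch of the conversion in Python (using nominal voltages, which real cells only approximate):

  def watt_hours(mah, volts):
      # energy (Wh) = charge (Ah) * nominal voltage (V)
      return mah / 1000 * volts

  print(watt_hours(2000, 2))  # 4.0 Wh
  print(watt_hours(2000, 4))  # 8.0 Wh -- same mAh, twice the energy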

It's so silly. /rant


I 99.9% agree with you. Some nitpicks, since this is a discussion of getting units right:

mAh doesn't measure "current", because current is a rate of flow; multiply it by time and you get charge, which is what mAh actually measures.

2000 mAh hour is redundant - the 'h' in mAh is hours. :)

But your basic point is 100% correct: All batteries should show their capacity in Wh.


It's also redundant because with 2000mAh... "thousand milli" can just be eliminated, leaving us with 2Ah.

Everyone constantly referencing thousands of thousandths is another one of my "favorite" things about the industry using mAh as a standard measurement of battery "capacity" (besides the fact that they almost never specify voltage or watt-hours).


No need for scare quotes around capacity. It is the correct term, physically. Capacity is the amount of electrical charge something can store, and Ah is a unit of that. Not the natural unit though, that would be Coulomb (C).

Why use charge and not energy? Because the voltage changes as the battery discharges, so it's not as straightforward as just multiplying by the voltage.
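
To illustrate: energy is the integral of voltage over the charge delivered, and the voltage sags as charge drains. A rough sketch, with a made-up discharge curve purely for illustration:

  # hypothetical discharge curve: (charge delivered in Ah, terminal voltage in V)
  curve = [(0.0, 4.2), (0.5, 3.9), (1.0, 3.7), (1.5, 3.5), (2.0, 3.0)]

  # trapezoidal integration of V dq gives energy in Wh
  energy_wh = sum((q2 - q1) * (v1 + v2) / 2
                  for (q1, v1), (q2, v2) in zip(curve, curve[1:]))
  print(energy_wh)  # ~7.35 Wh
  print(2.0 * 3.7)  # 7.4 Wh from the flat nominal-voltage shortcut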


> It is the correct term, physically.

Agreed, but not from a consumer perspective, where it is very confusing that a battery with half the capacity might have twice the energy due to differences in voltage.

From a consumer perspective, the battery capacity is the amount of energy, regardless of whether that usage is technically wrong.


I don't think "capacity" necessarily has a concrete unit connected to it. I am pretty sure you could call both the charge and the energy in a battery its capacity.


> Not the natural unit though, that would be Coulomb (C).

To be pedantic, you mean the SI unit. The natural unit of charge is the elementary charge (e).


> It's also redundant because with 2000mAh... "thousand milli" can just be eliminated, leaving us with 2Ah.

But then you'd have to do more conversions when comparing. I've seen batteries with capacities as low as 10mAh. 4 digits is still a reasonably practical number.


In the consumer space, virtually no battery I've seen in more than a decade is less than 1000mAh. If you looked at the Reddit post, you know we're talking about laptops, tablets, phones, external power banks, that type of consumer-facing application. But, even then, converting is fine. People understand there are 1000 meters in a kilometer. I don't think any of this is rocket science.

However, outside of the consumer space... doing conversions like that is also fine. When browsing DigiKey, components are often listed in the optimal unit prefix. When necessary (not so much with batteries), the interface allows you to filter based on ranges you provide, and it will perform the necessary conversions for all the products to give you an accurate list.


> In the consumer space, virtually no battery I've seen in more than a decade is less than 1000mAh.

Behold:

https://www.amazon.com/Panasonic-BK-4MCCA4BA-eneloop-Pre-Cha...

https://www.amazon.com/PANASONIC-BATTERIES-CR1216-LITHIUM-BA...

https://www.amazon.com/SR521W-SR521SW-LR521-Alkaline-Battery...

> People understand there are 1000 meters in a kilometer. I don't think any of this is rocket science.

Yes, but it's one more mental operation, and every mental operation like that is relatively costly. People don't actually have that many slots of working memory. You are optimizing to minimize digits, which is probably the wrong thing.


>> virtually no battery

> Behold <a single counterexample that is far less commonly encountered by consumers than phones, tablets, laptops, or external power banks>

EDIT: Ok, now you have updated your comment to also include coin cells, which is even less relevant. This just shows you don't appreciate what we're talking about here.

Consumers care about the amp-hour capacity of larger devices for various reasons. They need to figure out how many times they can recharge a phone from an external power bank. They want to understand how much less efficient their laptop is than their tablet. That's not what they do with coin cells or AAA batteries.

You have also ignored that we can express 0.8Ah. It doesn't have to be 800mAh vs 2Ah. Since things less than 1Ah are far less common for consumers, it would be logical to just show everything in Ah, and then represent those smaller things as fractional Ah, if we're afraid of multiplying and dividing by 1000.

We could represent all distances in millimeters just in case, but we don't.


> Behold <a single counterexample that is far less commonly encountered by consumers than phones, tablets, laptops, or external power banks>

People still use AAA batteries, and that's probably the most well-known, premium brand of rechargeable AAAs.

> Consumers care about the amp-hour capacity of larger devices for various reasons. They need to figure out how many times they can recharge a phone from an external power bank. They want to understand how much less efficient their laptop is than their tablet.

But they do care about the metric prefix? mAh is far more established and slightly more convenient to use than Ah for these kinds of batteries.

Edit:

> You have also ignored that we can express 0.8Ah. It doesn't have to be 800mAh vs 2Ah. Since things less than 1Ah are far less common for consumers, it would be logical to just show everything in Ah, and then represent those smaller things as fractional Ah, if we're afraid of multiplying and dividing by 1000.

I haven't ignored that, it's just not a very appealing thing to do and doesn't answer any of the problems with switching the customary prefix.

Yes. It would be possible to relabel all consumer batteries with Ah instead of mAh, but why would anyone want to go through all that trouble?


> People still use AAA batteries

Consumers do not know the amp-hour capacity of coin cells or AAA batteries. They don't know and they don't care. Consumers put a coin cell into an AirTag and come back a year later to replace it when their iPhone tells them to. They don't care about the capacity. It's not relevant to this discussion at all. The percentage of consumers who care about the capacity of those types of batteries is certainly tiny.

> But they do care about the metric prefix? mAh is far more established and slightly more convenient to use than Ah for these kinds of batteries.

It is not more convenient, since consumers do not think about <1Ah values, virtually ever.

> Yes. It would be possible to relabel all consumer batteries with Ah instead of mAh, but why would anyone want to go through all that trouble?

And I repeat: We could represent all distances in millimeters just in case, but we don't.

LA is only 4,506,163,000 millimeters from NYC. Very convenient.


> Consumers do not know the amp-hour capacity of coin cells or AAA batteries. They don't know and they don't care. Consumers put a coin cell into an AirTag and come back a year later to replace it when their iPhone tells them to.

Then why label batteries with a capacity at all, if consumers don't care so much?

They care when they're buying batteries and they have a choice of different ones to buy.

> It is not more convenient, since consumers do not think about <1Ah values, virtually ever.

Even if that were true, it doesn't justify changing the customary unit.

> And I repeat: We could represent all distances in millimeters just in case, but we don't.

> LA is only 4506163000 millimeters from NYC. Very convenient.

And people could report their height in meters, yet they frequently use centimeters.

Among other things, we don't use millimeters for long distances because 10-digit numbers tend to be an awkward number of digits to work with, but 4 or 5 digits are still pretty easy and natural to work with. I wonder how many digits the LA to NYC distance is in kilometers? Well, wouldn't you know it, it's 4.

Anyway, this conversation isn't going anywhere. Batteries will continue to be labeled in mAh, even if you have a strong personal opinion that everyone else is using the wrong prefix and that thousands of companies and millions of people just need to switch to Ah now.


We label batteries with capacity in cases where said capacity is tiny compared to the user's needs. It allows the user to guesstimate if their phone will last 1 or 2 days on a single charge.

For AAA batteries, which in typical use last at least several months, the exact capacity number becomes largely irrelevant.


> They need to figure out how many times they can recharge a phone from an external power bank.

Watt-hours for the win! No more 10'000 mAh at 5V (nope, it's 3.7V) weirdness.


I agree completely!


How about Apple AirPods then? I've found 43.24 mAh and 49.7 mAh depending on the version.


> "thousand milli" can just be eliminated, leaving us with 2Ah.

They call it marketing.


Have you ever heard the word "half-pair"?


For future rants: mAh is a measure of charge, not current. And batteries don’t have “power stored” — they have “energy stored”.

When you make a statement like “A 2000mAh hour battery at 2 volts has half the power of a 2000mAh battery at 4 volts,” it’s a bit nonsensical — a battery does have a certain amount of power available, but that amount is not charge * voltage, nor is it even necessarily proportional to the energy stored.


Historically, batteries have had known standard voltages (12V, 6V), and within that context mAh is easily convertible to Wh.

With USB PD, power banks, etc. that operate at many voltages, using mAh is unclear. It's only used historically, as a number comparable to previously used numbers. As someone mentioned, the implied voltage is that of the cell, 3.7V.

So the conversion formula is Wh = mAh/1000 * 3.7V (or substitute the standard voltage for the context)
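
That formula as a quick sketch in code (the 3.7V default being the Li-ion cell convention mentioned above):

  def wh_from_mah(mah, volts=3.7):
      # Wh = mAh / 1000 * nominal voltage
      return mah / 1000 * volts

  print(wh_from_mah(10000))     # 37.0 Wh at the implied cell voltage
  print(wh_from_mah(10000, 5))  # 50.0 Wh if you wrongly assume the 5V USB output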


The main problem with your argument is that for many use cases, the real voltage from a battery is not identical to the nominal voltage. As the current through the battery increases, energy loss due to the internal resistance of the battery will cause the external voltage to go down.

This means that actual Wh will depend on how many amperes the battery has to deliver. When the load is very low, this hardly matters, but when you need to power a high wattage engine/appliance with a small battery, this can be quite significant.


> Using Wh or mWh instead of mAh would make this whole problem go away.

Amp-hours (Ah) has been the 'standard' way to list battery capacity for decades, see for example the boating/marine world. I'm not against changing it to something else, but there's a long-tail of legacy here.

Also, while we're at it, can we change away from listing the brightness of light bulbs in watts: this may have made sense with incandescent bulbs initially, but we're now beyond those.


Most lightbulbs in Australia display brightness in lumens and power in watts. I think it makes sense to show both.


So who's going to explain to 99.999% of the world why a 10w/h battery that is 5v is incompatible with their 12v device that also takes a 10w/h battery?


No one? Just like no one explains that the 1.2v 1500mah battery isn’t compatible with the 9v 1500mah battery.


The ones saying this is a mistake are scientists and electrical engineers. They know what a w/h represents as a unit of measure. They have the frame of reference and know the equation. So they would rather see a standardized measurement that takes into account all factors in one measurement.

But you don't use batteries like that. Consumers use batteries that are a specific voltage. Watt hours obfuscates this fact. So when a user tries to plug a 1.2v 40w/h battery into a 12v device that takes a 40 w/h battery, they will not know why it does not work.

So we break it out into two separate, and unequally important properties for these consumers. The important property being voltage. If the voltage is the same, the battery can be considered by consumers to be "compatible" with their device. If the mAH is greater, they will have more battery life. Lower, and they will have less battery life.

W/H is insufficient for describing batteries unless you specify the voltage you are working with. People don't care about potential energy unless you tell them it's important. That's why batteries are represented as # of amps per hour @ a specific voltage.

Also, thank you for reminding me of how terrible and sophomoric Reddit is.


If I buy a USB-C PD power bank with 10,000mAh capacity, per the USB spec it can output power at 5V, 9V, 15V, or 20V, depending on which device is plugged in.

Which voltage is the 10,000mAh capacity in reference to?

Most likely it's actually the internal 3.7v cell voltage, which is even less relevant than any of the potential output voltages.

Your argument makes sense for purchasing raw battery cells (li-ion or plain old AAs) but as soon as those cells are integrated into a sealed consumer electronics device (battery pack, smartphone, laptop) it is no longer an interchangeable unit with a known voltage.


USB batteries are de facto considered 5V, so 10,000mAh at 5V.


> Consumers use batteries that are a specific voltage.

Consumers use batteries that are a specific form factor. They don't buy 1.5V batteries, they buy AA's. They don't know or care what voltage their computer battery is, they buy a Lenovo T14 battery. Et cetera. I can't think of a common single battery where you can buy the correct form factor but get the wrong voltage.

The only two pieces of information that matter are form factor and wattage. You can derive voltage from the form factor, but you can't derive form factor from voltage. You can derive amperage from wattage and voltage.

Edit: I finally thought of an exception. 6V tractor batteries are form factor compatible with 12V tractor batteries. But those are hardly common in 2023.


This is maybe getting tangential, but one of my pet peeves is that the industry doesn't establish standards for large batteries, like those in lawnmowers and leafblowers. If they were serious about the electric transition, they would make batteries interchangeable like AA and AAA batteries etc.

I understand the issues involved and that there are nontrivial obstacles (see: IoT), but it is really frustrating to deal with manufacturer lock-in. I also think it adds to confusion of the sort in the linked piece because there isn't a discussion of standards per se along the lines you're mentioning.

I suppose safety issues are magnified a hundredfold compared to the problems that people have run into with USB product standard mismatch but that's probably another reason why people might actually buy brand-name OEM batteries even if there were standards.

I guess in general I wish there was some open industry discussion of this, even if it meant a white paper where they throw up their hands and decide it's too risky.


There is at least one company making adaptors for power tool batteries [1], so that you can do things like use your Black & Decker drill battery with your Milwaukee circular saw.

They seem to be mostly around $30, but might be worth it for someone with a lot of tools with different batteries who wants to minimize the number of different chargers they have to deal with.

https://powertoolsadapters.com/


LiPo batteries vary a lot in voltage but all look the same and have the same connectors.

These batteries are used a lot in RC cars, drones, etc. Basically anything you want to power where regular batteries won't do the job.


> LiPo batteries vary a lot in voltage but all look the same and have the same connectors.

They don't. LiPo batteries are just lithium batteries with 3.7v average and 4.2v max. If the voltage "varies" as you say, it's because they are arranged that way to achieve a higher voltage for the specific application.

The base cells themselves are all the same voltage. And people who are into RC cars/drones, etc. are usually technically apt enough to understand this and make their own battery "packs".


Correct, but you can’t tell from the outside if it’s a 7- or 12-volt pack without reading the label.


By w/h, I'm sure you mean Wh, which is a unit of energy.

(W/h is something completely different, and mostly relevant for how quickly a power source (like a power plant) can meet variation in power loads.)

As for the rest of your explanation, it's actually NOT a good reason to label batteries by Ah or mAh over Wh.

Wh is better for almost any purpose as long as it can be accurately determined. You still have to provide the voltage it is designed for, but it is just as easy to find the Ah's in a battery from the Wh as the other way around, when you know the voltage.

The problem, though, is that the true voltage of a battery isn't constant, but tends to vary depending on what load you connect to it. This is because every battery has an INTERNAL resistance (R_internal). This internal resistance will (for a basic cell) reduce the delivered voltage (as seen by the load) by

  U_load = U_nominal - I*R_internal. 
Also R_internal can increase if the battery is cold, hot or has some defect.

This means that the Ah unit provides a more precise measure for the capacity of a battery than Wh.
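
A small numeric sketch of that formula (the cell voltage and internal resistance here are invented but plausible values):

  def load_voltage(u_nominal, current, r_internal):
      # U_load = U_nominal - I * R_internal
      return u_nominal - current * r_internal

  # hypothetical 3.7V cell with 50 milliohm internal resistance
  for amps in (0.1, 1.0, 5.0):
      u = load_voltage(3.7, amps, 0.05)
      print(f"{amps} A -> {u:.3f} V, {u * amps:.2f} W delivered")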


> Consumers use batteries that are a specific voltage. Watt hours obfuscates this fact. So when a user tries to plug a 1.2v 40w/h battery into a 12v device that takes a 40 w/h battery, they will not know why it does not work.

This is kind of sort of true for gadgets that take AA or AAA or maybe small lead acid batteries. They need a battery at a specific voltage (except they don’t — NiMH, carbon-zinc, alkaline, and Li (metal) batteries all tend to be interchangeable despite rather different voltages), and they consume roughly constant current.

But many modern devices use fancy power supplies that consume roughly constant power. And, more relevantly to newer devices, how many consumers say “hey, that cool USB charger operates at 16V internally, so I want X mAh?”

(Also, batteries have discharge curves, not constant voltages. But in mAh’s slight defense, the charge delivered from a battery is less likely to vary widely with different rates of discharge than the energy delivered.)


Devices don't say "It needs a 40Wh battery."

Devices say "It needs an AA battery" or "It needs an 18V battery."

>W/H is insufficient for describing batteries unless you specify the voltage you are working with.

Yes. Then do exactly that. Describe Wh and voltage.


There are two important numbers when it comes to batteries. Volts and Amp/hours. Watt/hours is simply a calculated value. Why provide watt/hours and force people to back-convert to amp/hours?

Watt-hours is only useful when comparing batteries of different voltages, which is not a very practical thing to be doing. For batteries of the same voltage, amp/hours is a simpler and easier way to compare.


What voltage is your laptop battery? Your smartphone battery? Your Anker power bank battery?

What voltage are they outputting over USB-C PD?

The Volts and Amp-hours (not amp/hours) metrics make sense when comparing raw battery cells, but when integrated into a device where the voltage is unknown or variable output for charging different devices, it becomes less relevant.

Watt-hours tells you the total energy. A laptop with more Watt-hours has more battery than one with fewer Watt-hours, all in a single unit without needing to multiply things together.


>Why provide watt/hours and force people to back-convert to amp/hours?

Because you don't use amp-hours for ANYTHING.

If you have a battery and want to know how long your 3.3V device will run from it, the simplest way is to just measure the power your device uses and compare it with the Wh of the battery.

Doesn't matter whether the input is a single 3.7V battery or an 18V battery pack; all you care about is your device's power usage and your power supply's efficiency at that voltage.

> For batteries of the same voltage, amp/hours is a simpler and easier way to compare.

It's not simpler, and it's exactly as easy to compare; bigger number is better...


Just describe Joules. No need to bring hours into it.


Not everyone likes to think about how many seconds their devices will last. Hours are a lot easier to reason about.


> Consumers use batteries that are a specific voltage.

Nowadays with sealed, non-serviceable batteries, it is not always abundantly clear what voltage the device takes for power (which can be different from its charging voltage) unless it's in your hand and you can read the specifications printed on the back (if they're there at all). Breaking these two datapoints out into voltage and amp-hours no longer makes sense given that most batteries with consumer-advertised capacities are product specific.

Our relationship with battery powered devices has changed since the days of 1.5V cells that we tossed in at varying amounts (2x = 3V, 4x = 6V, 6x = 9v, etc). Pre-“USB-PD”, it still kind of made sense to use mAh because we knew it was always 5v but that too has changed.

The mistake is not changing to mWh once we stopped describing 5v devices.


> Pre-“USB-PD”, it still kind of made sense to use mAh because we knew it was always 5v but that too has changed.

Really? How many of those devices actually had 5V batteries inside? (Are there even credible 5V Li-poly battery banks?)

Original USB (based on a quick search) specified 5V +- 5%. You’re not getting that by direct connection to a battery, and you also can’t charge the internal battery by direct connection to any common supply. Somewhere in the power brick is a switching converter.

So maybe a 5000mAh USB-A power brick can supply 5000mAh output before conking out, but I doubt it.


> How many of those devices actually had 5V batteries inside? (Are there even credible 5V Li-poly battery banks)?

That’s a good point and revealed that I conflated battery capacity with source-able/usable power. You are correct (as far as I am aware as well) that you cannot use a lithium ion/polymer based battery chemistry and have between 4.75 and 5.25 volts output.

Thinking about it some more, the battery packs seem to represent a special case where the function of the device is to provide power and, as per your point about squeezing out maximum power, we probably need to have two separate power ratings: one for the battery and one for the usable output.


But most consumers probably don't care about the internal pack voltages or whatever; all of that is just implementation detail. All they care about is how much more time that battery is going to power their device. Which, if they know how many watts their device uses and how many watthours their battery is good for, they'll roughly know the answer.

For a lot of people, all they really need to know is the watts and watthours. Everything else will just line up in the implementation. If I know my phone uses ~1W on average, and my battery is 18Wh, then I know I'll get roughly 18 hours of life on that battery. If I want to charge the 18Wh battery in two hours, I'll need one that can supply at least 10W after charging losses.
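
That back-of-the-envelope math, sketched (the 90% charging efficiency is an assumption, not a measured figure):

  battery_wh = 18
  load_w = 1
  print(battery_wh / load_w, "hours of runtime")  # 18.0

  charge_hours = 2
  efficiency = 0.9  # assumed charging losses
  print(battery_wh / charge_hours / efficiency, "W charger needed")  # 10.0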


There are no 5V Li-po batteries; LiFePO4 is 3.2V. The nominal per-cell voltage of a battery depends on the anode/cathode pair. It can be tweaked and is state-of-charge dependent, but it isn't an adjustable parameter.


> So we break it out into two separate, and unequally important properties for these consumers. The important property being voltage. If the voltage is the same, the battery can be considered by consumers to be "compatible" with their device. If the mAH is greater, they will have more battery life. Lower, and they will have less battery life.

It works exactly the same way if you replace mAh with Wh.


Since you have to specify the voltage no matter how you measure energy, because voltage is what primarily determines if a battery will work in most consumer electronics, what units you specify energy in is not relevant as far as consumers comparing batteries go.

If I see that two batteries have the same stated voltage and battery A has higher mAh than B I know that A should last longer in my device.

But that's exactly the same if I see A and B have the same voltage but A has higher mWh. It should last longer.

Even if the energy was specified in some completely ridiculous way, such as how high that amount of energy could lift a 2 liter bottle of diet Pepsi on the moon, I'd just have to look at the two numbers and pick the bigger one.


...your explanation would work way better if the unit dimensions actually worked out. And they don't, since W/h is not a unit of energy, and mAh is not a unit of "current per hour".

And your second paragraph is bogus anyhow: the voltage would be labelled on the batteries regardless of whether the other metric is charge (which is what mAh is a unit of) or energy (which is what Wh is a unit of).

P.S. It is still beyond me why people in general don't comprehend that both "ampere" and "watt" are already units of "speed"; no need to divide them by time any further. Maybe we should've given km/h its own name, to ease people into such a concept? /rant


Joules are a lot harder to reason about than watthours. I've got a 5W device that I run for three hours. How much energy did it use? Easy, 5W * 3 hours = 15Wh. How many joules did it use? 1 J = 1 wattsecond, so 5 * 60 * 60 * 3 = 54,000J = 54kJ.
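
The same arithmetic in code, for comparison:

  power_w, hours = 5, 3
  print(power_w * hours, "Wh")        # 15 Wh
  print(power_w * hours * 3600, "J")  # 54000 J = 54 kJ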


Stop dividing watts by hours or even seconds, please. 1 Joule is 1 watt sustained for 1 second, that is, 1 watt * 1 second.


I didn't in the actual math. I just wrote with the slash to make it easier to read than wattsecond. You'll see I did do 1 watt * 1 second. Please stop being so pedantic. I edited it for you though.

Either way, I (and most other humans) will probably continue to use watthours instead of joules for most things, as most people will probably agree 5W for 3 hours makes more sense as 15Wh instead of 54kJ. Joules are too small a measurement for a lot of practical applications. Needing to multiply by 60 or 3600 to get to minutes and hours gets frustrating.


...can you please point me at the place in my original comment where I am arguing for using joules instead of watt-hours?


I took your original comment as a complaint about using watthours in general, as watts are already a unit of speed. Implying you'd prefer to not use watthours. I guess I was wrong on reading it that way.


Dividing watts by hours makes perfect sense when making a statement about how fast a lignite coal plant can meet transients in load. ;)


People want runtime estimates, often for fixed voltages. That's where Ah comes in handy. Stop complaining that people do "stupid" things before even Googling what it's for.


But Wh would be just as handy, and then wouldn't need voltages to be fixed in the comparison.

For example, I've got a battery pack on my desk that says it's a 10,000mAh battery pack. No additional documentation on it. It has two USB ports for output and one USB port for input. Is that mAh rated at the 5V for USB? Is that mAh rated at the probably 3.6V of the lithium pack? Is it using some other voltage internally? Is it a 36Wh battery or a 50Wh battery or something else entirely? How long is it going to run my 5W load? 10 hours? 7.2 hours? Less? That's a nearly 30% swing in runtime that I don't truly know unless I go measure it, when they could have just stamped how many Wh it's good for and I'd know.
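
Spelling out that swing (the two candidate voltages are the ones named above):

  mah, load_w = 10000, 5
  for volts in (3.6, 5.0):
      wh = mah / 1000 * volts
      print(f"{volts} V -> {wh} Wh -> {wh / load_w} hours")
  # 3.6 V -> 36.0 Wh -> 7.2 hours
  # 5.0 V -> 50.0 Wh -> 10.0 hours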


> then wouldn't need voltages to be fixed in the comparison.

> How long is it going to run my 5W load? 10 hours?

Voltages are fixed. Fixed, passive. Amperage is the designer's discretion. You can't make a computer run at 7V because you want to. The computer draws 12V/1A, and you have a bunch of 3.7V/2600mAh cells around, so you join 4 in series to get 14.8V, shave off 2.8V through an LDO, and get 2.6h runtime (is that 2.8W of excess heat there? ew).

Wh? Not handy at all. I don't control how many watts my device draws. I know the CPU runs at a precise 3.3V +/- 200mV and draws up to 20mA at full speed. Not "50 mW at up to 3.3V", no. I know my phone charges at up to 5V/2A or 10W, so what. An airline-approved XL sized battery should roughly run the phone for 10 hours, give or take a few, with just the battery, and that's as accurate as it gets from advertised data.

> Is that mAh rated at the 5V for USB?

It's practically ALWAYS rated at 3.7V, because every "Chinese" brand - not always from there or with associated ethnicity - wants figures based on the maximum indisputable advertisable numbers. That had already eliminated ambiguities years ago. I've seen batteries with mAh capacities rated at an advertised nominal voltage of 3.6V as well as at 5V; those don't exist anymore.

All in all, battery capacity in Wh, not so practical. Maybe it could make some sense for consumer variable voltage power supplies. Definitely not for cells.


> Voltages are fixed. Fixed, passive.

And yet this thing has some internal pack voltage, it's putting it out at something like 5V, and then my phone is turning that into something other than 5V for whatever its internal pack voltage is, which then once again gets changed to whatever its internal circuitry runs at. Doesn't seem very fixed to me.

USB-PD has tons of different voltages it can operate at, all vastly different from whatever pack voltage my phone or laptop run at, vastly different from what the CPU runs at.

Sure, when you're building something and you control all the variables, it's easy to say the voltage is fixed. But most consumers aren't in that environment; they're plugging their phone into whatever charger supporting a myriad of voltages and amperages.

And circling around, you can't be certain how many hours my 10,000mAh battery pack will run a 10W load. The only voltage assumed here is USB, 5V, so why would anyone assume 3.7? Why not 2.5? Let's just make up whatever numbers for the internal pack voltage. And you can't tell me how many times that 10,000mAh battery pack can recharge my phone's battery pack, since we don't know either pack's voltage. But if we knew how many watthours each pack was, we could reason about that.


> when you're building something and you control all the variables, it's easy to say the voltage is fixed

No, my point is I don't control that variable. I can't get 14.4V microcontrollers, nor 6.5V batteries. Neither can you. I am obliged to power it with 3.280V/0.1-150mA regulated power from 4.2/3.7/2.8 max/typ/min aluminized pouches of wet wipes. Wattage figures therefore won't help me.

USB-PD sure runs at 5/9/12/15/20V. Off of 3.7V li-ion cells, to charge 3.7V li-ion cells. So it's clear how many times more capacity a 20100mAh power bank battery has over 5000mAh phone battery, or a series 6-cell 18650 laptop battery pack(with nominal 2600mAh capacity per cell: 9x2600mAh of 3.7V equivalent capacity).

> And circling around, you can't be certain how many hours my 10,000mAh battery pack will run a 10W load.

37Wh? 3.7 hours, _ish_. Give or take one full hour. I don't even know what capacity of an advertised 10000mAh battery at ~0.3C might be. Battery capacity depends on discharge current. And here again it's current, not discharge energy.


> 3.7 hours, _ish_. Give or take one full hour.

So after assuming a value that's not even printed on it, we're not even sure within +-25%. What a great unit of measure for battery storage. Whereas if it was just rated in Wh, we'd actually know.

I'd much rather a battery say how many Wh it's good for, its nominal voltage, and its peak amps. It's incredibly easy to reason about how many watts a phone uses, a laptop uses, a game console uses, a TV uses, a car uses, a blender uses, a kettle uses, etc. Then it's easy for me to know how long this battery can potentially run X.


> Whereas if it was just rated in Wh, we'd actually know.

No.

The capacity literally changes depending on how fast you draw.

This is especially a problem with lead acids, so they often have a couple of "n-hour rate capacity" figures to make estimates for your particular use cases. It's less of a problem for li-ion, where C-rates (not coulomb, just the letter C) are used to plan for cell counts, runtime, current limits, and required wire thickness (because wire thickness depends on current, yet again, not total energy through it). It's a non-issue for dry cells, because it's among the equipment designer's responsibilities to account for.

And that has little to do with Wh vs mAh. It'll be easier to guesstimate more precisely from mAh using experiences and implications, if anything.


> So it's clear how many times more capacity a 20100mAh power bank battery has over 5000mAh phone battery

We don't, because we don't know the pack's battery voltage nor do we know the phone's battery voltage. You're assuming they're both 3.7V, but it's not listed, so we don't actually know unless we tear it open. In fact, my phone is rated for 3.85V; if the pack was rated at, say, 3.6V or less, that's a decent bit of difference. Over 20,000mAh, that voltage difference amounts to a discrepancy of 5Wh, which for a device using about a watt means five hours of difference in run time from that tiny discrepancy. Which, as mentioned, would be a good bit closer to a realistic answer if it just listed it in Wh, because personally I don't care what its internal pack voltage is; I just care about how long it'll keep my 1W device alive. And even then, theoretically this battery pack can charge my laptop as well, and that laptop's battery is definitely not even close to the same voltage as that battery pack. So comparing Ah to Ah there is pretty meaningless.

I've got a battery pack here rated for 5,000mAh. Roughly the same size as my phone, huh. Oh wait, it's 56V. So roughly how many times can I charge my phone's battery off this pack? Going by Ah, once! But if I just knew my phone battery was ~17Wh, and we knew the battery was 280Wh, then we can guess we'll get about 16ish charges off of it.

https://www.jackery.com/products/explorer-1000-portable-powe...

Would you honestly argue that, for understanding the capacity of the above device, it would be more useful if it gave some rating purely in Ah? We've got no info on its internal pack voltage, so that's practically useless to us. Its input is 24V, but is that what its internal pack actually is? Who knows. Clearly 3.7, because that's what we'll assume every pack is! Then it's running things at all kinds of voltages: 12V, probably half a dozen profiles of USB-PD, and 110V.

I want to charge my car and I'd like to know how long it'll take. I'll do the back conversions away from the useful units so we're talking purely amps here: it's a 170Ah battery pack. I'll be pulling 40A from the wall. So roughly four and a quarter hours to charge, right? Nope, we didn't normalize for voltage. Well, we can normalize for voltage if we're just looking at watts. So it's 240V 40A = 9,600W. It's a 68,000Wh battery pack. So the answer is a bit over 7 hours.

And sure, you're absolutely right about c-rates. The total useful capacity of the battery is going to be different depending on the current you draw on it, fully agreed. But that same logic still applies if you're talking about runtime and you're doing your math purely in Ah. Running at half or double the amps doesn't necessarily mean you'll get twice or half the life, your unit of measure doesn't change that, so I don't get why you're bringing up concepts like wire thickness when discussing which is the better unit to measure battery capacity in a general sense.

Sure, if we're talking purely about you building a circuit and wanting a battery matched to the circuit exactly, matching the voltage is useful. And when you actually do have voltage fixed, Ah becomes relevant. But in the real world, voltages are all over the place. You don't know what my battery pack here is rated for; you only have a guess that it's 3.7V. That assumed 3.7V is different from the 3.85 my phone's battery is rated for and is way off what my laptop is, so it's not a straight 1 to 1 comparison.
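
The charging example from above, as a sketch:

  # naive amps-only estimate, ignoring the voltage mismatch
  pack_ah, wall_a = 170, 40
  print(pack_ah / wall_a, "hours (wrong)")  # 4.25

  # normalizing through watts gives the real answer
  wall_w = 240 * 40  # 9600 W from the wall
  pack_wh = 68000
  print(round(pack_wh / wall_w, 2), "hours")  # 7.08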


Oh, so you don't know. Battery voltage is an S-shaped function of state of charge. For Li-ions except LiFePO4, it's 3.7 +/-0.5V. The charging circuit takes care of that, and designers take care of current draw so as not to exceed the discharge current. No, there's no "3.85V battery". No hidden 5Wh either. Yeah EVs and laptop batteries use Wh because it makes sense there. That's irrelevant (as yet).

Y'all are just a bunch of arrogant dicks crying that you can't divide mAh by mAh and the whole world has to do your homework just for you. No, you can, and that covers 90% of your needs. That is to say, you don't get more precise than that anyway from just sticker figures.

Now, go back to Reddit. This is more than enough Reddit-speak for a week.


https://www.ifixit.com/products/google-pixel-6a-battery-genu...

Read that label. What does it say? 3.7? Nope!

https://www.amazon.com/2600mAh-Rechargeable-Replacement-Batt...

Is this a 3.7V battery? Nope!

And I've got a pile of rechargeable batteries here that are 1.5V, 1.2V, and others.

https://www.amazon.com/EBL-Lithium-Batteries-AA-Pack/dp/B0CH...

> Yeah EVs and laptop batteries use Wh because it makes sense there.

And it makes sense there because...?

> Oh, so you don't know. Battery voltage is an S-shaped function of state of charge.

I do know that. You don't need to be rude.


I don't understand why you supposedly know, yet pretend to think there is a mysterious 5Wh situation. It just makes no sense. Only one of those can be true at the same time. Same for the "3.85V battery". If you know, you know.


Because it's not a 3.7V pack, it's a 3.85V pack. I'm at like 78% state of charge and it's still 4.26V. It's obviously not a 3.7V pack.

You yourself even mentioned there were 3.6V packs on the market for a while. So if it is a 3.6V pack, a 10,000mAh pack would be 36Wh. This battery is a 3.85V 4410mAh 16.97Wh. If we ignored the voltage, 10 / 4.41 = 2.26x the energy. But 36 / 16.97 = 2.12x. If that 10,000 was the actual same 3.85V that's 38.5Wh. So with that, there's a 2.5Wh difference between what it would have been if the voltage did match. If it's a 20,000mAh pack, that becomes 5Wh of difference for a pack rated 3.6V versus one rated 3.85.

Which, as I've shown, it is rated for 3.85V. And you've even agreed, there were packs on the market with an internal pack voltage of 3.6V!

> Yeah EVs and laptop batteries use Wh because it makes sense there.

And I reiterate, why does it make sense there? Because you're going to juggle a lot of different voltages and amperages when talking about charging, and we're relying on charging circuits to deal with it. None of what I've been talking about was about getting a little 3.7V pouch to run a small microcontroller or whatever; I'm talking about what actual consumers will really experience in the wild. So I've got a phone with a 4410mAh battery and a wall charger rated for 25W. About how long will it take to charge? Need more info! I've got a battery rated 16.97Wh and a 25W charger. Roughly how long will it take to charge? 17 / 25 = 0.68, so about 40-45 min. That's the kind of question most consumers are likely to encounter.


Just a whole lot of nonsense. If you want to splice some 25W charger into your 3.8754568351V battery and blow it up, go do it over on Reddit. Somewhere far enough away from here. You've done nothing meaningful to move the Chesterton's Fence.


Since you seriously seem to doubt that 3.85V batteries can possibly exist, here's some additional reading for you. But I don't know if I can change your mind with any materials, since I've shown you the actual pack for my phone, where you can see the nominal rating, but you refuse to accept it. 3.6V is a thing, 3.85V is a thing. I don't understand why someone who clearly does know a bit about batteries refuses to accept the existence of a 3.85V battery.

https://batteryuniversity.com/article/bu-303-confusion-with-...

And sorry, didn't realize it but it's actually a 30W charger I'm plugged into right now. I didn't have to splice anything, it's what came from the factory. Or did you think all phone chargers were also 3.7V?

https://store.google.com/us/product/usb_c_30w_charger?pli=1&...

You really didn't need to be nearly as rude in your comments here.


There is no such thing as a device that takes a 40Wh battery. A device will take a battery with a specific voltage, and will last for different amounts of time based on the capacity.


Let's say I'm buying a portable power supply for my phone or laptop. I know my device can take a range of input voltages (for my mac: 5V, 9V, 15V, 20V, 28V; for my pixel: 5V, 9V) and know my device will negotiate with the power supply over USB-C for the correct voltage. Since my device can handle either, I would be equally happy with a power supply that used 5V internally as one that used 9V internally. So I do primarily care about the capacity of the power supply in Watt-Hours (energy).


You're not wrong, but the problem we're hitting is that most mobile tech has divorced product/supply voltage from battery voltage, and the result is a lot of "white lies" where the values stated don't add up to a whole.

My laptop has a 6000mAh battery and will charge from 5V. But my 10,000mAh powerbank will not fully charge it. The question is how to explain this to my mother.

To calculate this properly, you have to do some actual digging to find the nominal battery voltage of the laptop, since it's not any of the USB-PD voltages, and is thus entirely divorced from the laptop's input voltage.

And then you have to know that most powerbank vendors tell you the rating of the cells, not the rating of the product. So they will tell you that it's a 10,000mAh battery, and that it outputs 5V - but these two facts are only honest in isolation. It is not a 50Wh battery, it's a 37Wh battery. Or a 7,400mAh product that is powered by 10,000mAh cells.

Almost anywhere we use USB-PD, stating Wh is a lot more honest to the consumer. If we tell my mother that her laptop is 70Wh and the powerbank is 37Wh, she can infer it'll give her about half a full charge. If we tell her the laptop is 6000mAh and the powerbank is 10,000mAh, she can be forgiven for inferring it'll give her almost two charges.

--

For an example of what I'm calling "white lies": https://www.anker.com/products/a1271

(And to be clear, I'm not blaming Anker for this, I believe it's near-universal. I'm only using Anker as an example because they're a more-trusted brand.)

All the way down the bottom of the page, just above the FAQ, they give the capacity as "20100mAh/72.36Wh". 20100mAh is given 19 times - 72.36Wh is given precisely once. From 72.36Wh / 20.1Ah we get a nominal voltage of 3.6V, not 5V. 3.6V is never otherwise stated. I've never seen any vendor tell you that you're seeing the cell rating, not the product rating - but the better vendors will give you enough information to figure it out yourself.
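
Working that label backwards (a sketch; only the two label figures come from the vendor, and conversion losses are left aside):

  rated_mah, rated_wh = 20100, 72.36
  cell_v = rated_wh / (rated_mah / 1000)
  print(round(cell_v, 2))  # 3.6 -- the nominal cell voltage, never stated directly

  # reading 20100mAh as if it were at the 5V USB output would claim:
  print(5 * rated_mah / 1000, "Wh")  # 100.5 Wh, wildly optimistic

  # the honest mAh figure at the 5V output, before conversion losses:
  print(round(rated_wh / 5 * 1000), "mAh")  # 14472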


Your theory might make sense if the voltage was actually specified, but when I last bought a USB-C power bank I could not find that information anywhere, so I did not know if I was comparing like with like.


I’m not sure where you live, but I believe every battery legally sold in the US is required to label its I/O voltage and amperage, and its capacity: voltage, amp-hours, and watt-hours.

At the very least, every wall outlet and power bank I’ve seen has text somewhere that tells what it outputs, with the power banks also having capacity, both amp-hours and voltage. Though I will say the text is rather small.


The description of a typical power bank leaves voltage and watts implicit, as far as I can see https://www.amazon.com/dp/B07S829LBX/

I mean, I can work out what they probably are with a little general knowledge and arithmetic, but it isn’t as clear as you suggest.


In a saner world, it would be implicit in the USB-C. As it is, you know that it supports 5V.

It might also support higher voltages (up to 20V), though if it doesn't specify I wouldn't assume. If you need higher voltage, you'll have to look for a battery that specifies it. (And hope that the package isn't lying.)


Every powerbank vendor I've seen gives the battery rating at 3.6/3.7V, so USB implying 5V gives us misleading values. So a 20,000mAh powerbank is 20*3.6V=72Wh, not 20*5=100Wh.

I've never seen a vendor that specifies 3.6V, the best you can hope for is that they give mAh and Wh so you can derive it yourself. If we need the Wh rating to keep the mAh rating honest, the mAh becomes redundant before the Wh does.


NiMH AA cells are 1.2V, while alkaline AA is 1.5V, and most devices will accept either. So for good switched power supplies Wh is more important, since that will correlate best with battery life.

Of course for linear power supplies or plain resistive loads you'll get different results, so you're right that it's impossible to use a single number to summarize the capacity of a battery.


If we need two numbers anyway, why not use volts and w/h instead of volts and mAh?


>They know what a w/h represents as a unit of measure

This is basic middle school physics.


And still "w/h" makes no sense. As explained in other comments, watt is already "energy per time", i.e. power, i.e. "1W = 3600 J/h".


You mean W-h, right? Watt-hour, not watts-per-hour as W/h would suggest


You mean Wh right? Not watts minus hours as W-h would suggest.


> Not watts minus hours as W-h would suggest.

That's not what that would suggest to most people.

https://physics.nist.gov/cuu/Units/checklist.html

"A space or half-high dot is used to signify the multiplication of units."

In practice, I'm pretty sure people often use a short dash because a half-high dot is hard to type on a keyboard. I also agree the dash is unnecessary, though.


If the short dash means multiply, what symbol do you use for subtraction?

What would the majority of people answer with this:

2-2 = ?

If I put 2-2 into Google, is it going to return 0 or 4?

If you're going to be pedantic, at least write it properly.


Do you think that a short-term plan is a subtraction of the word 'term' from the word 'short'?

Hyphenated words and phrases are not mathematical subtraction.

> Different authors and different journals utilize different spacing and separations for SI units. When writing out units, a hyphen (-) or a space is usually used between the words to represent multiplication (as in newton-meter or newton meter).

https://www.aje.com/arc/editing-tip-formatting-compound-si-u...

> In both English and in French, when the name of a derived unit is formed from the names of individual units by multiplication, then either a space or a hyphen is used to separate the names of the individual units.

https://english.stackexchange.com/questions/169950/hyphenati...


You can’t add or subtract units, so it’s not a relevant question. The dash gets used for many things. It is also used to hyphenate words, but people don’t ask how you can subtract two words.

> If you're going to be pedantic, at least write it properly.

I didn’t write it. I’m just explaining a commonly-used convention since you seem unfamiliar with it.


If the - symbol commonly meant multiply, wouldn't Google say 4?

Using - to mean multiplication is absolutely not common parlance.


No, because I’m only talking about unit notation, just like hyphenated words do not mean subtraction of the words. Why would Google follow unit notation rules for something that is completely irrelevant to unit notation?

If we’re going to be maximally pedantic, you will notice that “Wh” is also wrong, and should instead be “W h” with a space. That’s what NIST says the standard is. Why are you writing it incorrectly?

It’s almost as if different groups of people have different notational conventions. I already agreed it is nonstandard, but your notation is also questionable according to the standards for unit multiplication. It just happens to be a commonly accepted convention.

Here is an example of Wolfram Alpha handling this dash unit notation: https://www.wolframalpha.com/input?i=%283W-h%29%2F%281h%29+i...

I guess someone should tell Wolfram Alpha they forgot to “subtract the units”, whatever that means.


Yup, that makes perfect sense


> So we break it out into two separate, and unequally important properties for these consumers. The important property being voltage. If the voltage is the same, the battery can be considered by consumers to be "compatible" with their device. If the mAH is greater, they will have more battery life. Lower, and they will have less battery life.

That's a great way of putting it. I tend to agree, but W/h can also do the same.


We really need a 'Nutrition facts label' for electronic devices/batteries.

Usage Facts

Serving size: 1 hour

Servings per charge: 3.5

Total power (watt-hours): 56

Voltage: 12

Amp-hours: 4.67


"Total power (watt-hours)"

Nitpick, but Watt-Hours is energy, not power.

Why yes, I am very clever. Thank you for noticing.


Data sheets are a thing. More data doesn't help, though, unless the consumer knows what it means. Also, it isn't really even that simple: battery capacity can vary depending on discharge rates. The root of the problem is that batteries are complicated and most people aren't engineers.

https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus...


One problem is that an 18V, 2Ah battery MAY deliver 36Wh if the load has a resistance much higher than the internal resistance of the battery, but may ALSO deliver 18Wh if the load resistance equals the internal battery resistance.

For batteries that need to deliver a high current (meaning the resistance of the load is low), Wh can be imprecise. An example could be a car battery that is used primarily to start the engine.

Apart from such cases, I agree that one should strive to use Wh as labels instead of Ah.
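
A sketch of those two extremes, modeling the battery as an ideal source behind its internal resistance (the 0.1 ohm figure is an arbitrary choice for illustration):

  def delivered_fraction(r_load, r_internal):
      # series divider: the load receives R_load / (R_load + R_internal) of the energy
      return r_load / (r_load + r_internal)

  nominal_wh = 36  # 18V * 2Ah
  r_int = 0.1      # assumed internal resistance, ohms
  print(nominal_wh * delivered_fraction(10.0, r_int))  # ~35.6 Wh, light load
  print(nominal_wh * delivered_fraction(0.1, r_int))   # 18.0 Wh, matched load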


For others -- based on the title I believe OP may have meant to link to this specific comment in the thread:

https://old.reddit.com/r/macbookair/comments/w2eqxs/whats_th...


Explanation:

This is not a good thing - it's an industry mistake. Energy is measured in watt-hours - meaning "a watt for an hour". Watts are (simply put) volts times amps - so, 1 volt at 1 amp is 1 watt; 5 volts at 1 amp is 5 watts; 5 watts for 1 hour is 5 watt-hours. Similarly for amp-hours - 5 amps for 1 hour is 5 Ah (5000mAh).

The problem is, with amp-hours, you don't know the voltage - 20 volts, 2 amp-hours (2000mAh) is 40 watt-hours (Wh). This is where the industry mistake exists - companies labeling things to have a certain number of "mAh" instead of Wh gives completely nonsensical values that can only be relatively compared to each other (sometimes). It's a nightmare for people that actually understand energy capacity and speak and think in watt-hours.

Take that 40Wh / 2000mAh example. Now, imagine it were 5 volts instead of 20 volts - now the same battery becomes 8000mAh. Woaa, so much bigger?? Must be gooder?? No, it's the same battery, just a different voltage - but because you're only looking at one of the two critical numbers (amps, not volts), you see bigger numbers. Given that phones and devices charge at a variety of different voltages, and battery packs have different cell arrangements (in fact, usually 3.7-3.8 volts per cell - what even is the standard voltage reference they're using?), the mAh value has always been meaningless in any context.

Watt-hours, though, are what actually power your devices - laptop batteries hold between 45 and 100 watt-hours, phones between 8 and 15Wh, watches in the <1Wh range, rechargeable mice low 1-2's, power tools in the 60-130Wh (or more) range, and at the very top, EV batteries - which are measured in kWh (kilowatt-hours) - range from 20,000Wh to 150,000Wh. (this is why you never need to worry about charging your phone in your EV... it's like blowing a pinwheel in a hurricane).

I hope the industry stops talking mAh soon - it's seriously getting annoying having to dig through spec sheets to find the actual battery capacity somewhere expressed in watt-hours to know what I'm actually looking at.

https://old.reddit.com/r/macbookair/comments/w2eqxs/whats_th...
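
Running the quoted 40Wh example through code:

  wh = 40
  for volts in (20, 5):
      print(f"{volts} V -> {wh / volts * 1000:g} mAh")
  # 20 V -> 2000 mAh
  # 5 V -> 8000 mAh -- same battery, "bigger" number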


Why do they like to tell you mAh, but never power consumption in different states?

They'd probably have to call it "mAh per hour" to avoid people needing a science lesson to read it, but still.

mAh just says "to achieve our battery life we needed this much energy". It's not that much of an interesting spec on its own, since it would be better to have less energy drain and less mAh, if we could get the same performance, rather than more of both.

It's more useful in power banks, or would be, if they weren't often lying.


I'm also irritated by how the SI prefixes get abused here - saying TEN THOUSAND MILLIAMPERE-HOURS is just stupid when people rarely need to compare that to sub-Ah capacities. Why shouldn't we go further and start using nAh? Storage capacity seems to deserve proper prefixes instead of madness like 100 000 000 kB.


It's not an abuse, just as 2000 kilometres (that's 2 megametres!) or 200 grams (that's 2 hectograms!) or even 20 milligrams (that's 2 centigrams!) are not. One of the points of prefixed units was that a field could find a "base" resolution for its measurements such that most of them could be integer multiples of some chosen unit.


It's a unit of charge and not capacity.

You have the full-charge voltage, and the cutoff voltage where you consider the battery drained (the minimum operational voltage for your circuit), and the charge that gets you from the former to the latter is the mAh number. I find it more practical than using plain energy (Wh or Joules).


TL;DR: mAh is not a unit of energy. Energy = Capacity * Voltage (mAh * V). Industry should use Wh (or Joules, which is the same unit type), instead of mAh.


And Ah is a measurement of charge; it reduces to coulombs. An amp is coulombs per second (Q/t); if you multiply by time again you get (Q/t)·t, which is just Q, or charge.
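
In numbers, since 1 A = 1 C/s and 1 h = 3600 s:

  mah = 2000
  coulombs = mah / 1000 * 3600  # Ah * 3600 s/h = coulombs
  print(coulombs)  # 7200.0 C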


In a battery, isn’t voltage the thing that scales easily (just add more series cells) while mAh is a measure of quality of each cell?


You can have a battery cell that is 3v, 100mAh. And another cell that is 6v, 100mAh.

The second cell has double the energy storage as the first cell.

Really mAh is only useful when comparing cells of the same voltage. It became a common metric since so many battery cells in electronics are 3.7v, so you could assume a constant voltage and just compare mAh.

But not all cells are 3.7v, so sometimes that mAh comparison falls down completely and you should really be using watt-hours (amp-hours * volts).


If a battery is internally a set of connected cells, then as a battery designer you have the choice of wiring the cells in series or parallel. Need more voltage: series. Need more mAh: parallel. Cell 'quality' is a vague term since some applications may care more about discharge rates or operating temperature range than total capacity.

Usually though, batteries are designed with a particular application in mind and that application will define a voltage that is acceptable, such as ~3.8V for phones or ~19V for laptops. Significantly above this would damage the device. Likewise, below that voltage and the device doesn't function. This is why mAh is a decent proxy for Wh; for a given application the voltage is fixed and so the conversion - for that application - is fixed. mAh for a given application can be compared easily.

Comparing batteries for different applications is harder, but not just because you have to convert mAh to Wh: different applications require different performance characteristics beyond just voltage. You might care about weight, physical dimensions, operating temperatures, discharge rates, charge rates, self-discharge rates, output impedance.
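
A sketch of that series/parallel trade-off, using hypothetical 3.7V/2600mAh cells:

  cell_v, cell_mah = 3.7, 2600

  def pack(series, parallel):
      # series stacks voltage; parallel stacks charge; energy is the same either way
      return cell_v * series, cell_mah * parallel

  for s, p in [(4, 1), (1, 4), (2, 2)]:
      v, mah = pack(s, p)
      print(f"{s}S{p}P: {v:g}V, {mah}mAh, {v * mah / 1000:.2f}Wh")
  # 4S1P: 14.8V, 2600mAh, 38.48Wh
  # 1S4P: 3.7V, 10400mAh, 38.48Wh
  # 2S2P: 7.4V, 5200mAh, 38.48Wh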


I think it's easier to think of voltage as the speed per thread of a processor and amperage as the max number of threads.

Neither measure the "quality" of the battery.

What DOES have to do with quality, though, is the amount by which voltage falls as the load increases, or in other words, how efficient the battery is.

This is similar to the efficiency of a power supply in a computer.


Nope, current scales more easily. Just add more batteries in parallel.

The problem starts with voltage, because now each series-connected battery or battery set needs to have very closely matched capacity or else some will be "wasted", and you need a cell balancer to keep them equally charged.


putting batteries in parallel comes with its own problems, namely that you need far higher current to charge them at a reasonable speed, which is more difficult to scale up effectively than voltage


> which is more difficult to scale up effectively than voltage

For those who don't know, this is because losses scale as current squared, instead of just proportionally like it does with voltage.

For example, charging a power bank with 20W of power at 5V you need to supply 4A of current. If you have a good USB cable with 0.1 Ohm resistance, that results in P = V*I = (I*R)*I = (4*0.1) * 4 = 1.6W of loss in the cable[1].

If instead you could use 20V to charge, then the current would only need to be 1A and hence the loss just (1*0.1) * 1 = 0.1W.

[1]: https://en.wikipedia.org/wiki/Ohm%27s_law
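
Those two cases in code:

  def cable_loss_w(power_w, volts, r_cable=0.1):
      current = power_w / volts      # I = P / V
      return current ** 2 * r_cable  # P_loss = I^2 * R

  print(cable_loss_w(20, 5))   # 1.6 W lost at 5V / 4A
  print(cable_loss_w(20, 20))  # 0.1 W lost at 20V / 1A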


Anyone know why using the wrong unit without enough info became common in the first place?


What is the Wh of a "12V" 1000Ah car battery with an internal resistance of 20 milliohm?

Try that for starter motors that draw 50A, 100A or 200A.

One may think that at 100A, the starter motor will be able to consume 1200W.

However, if you pull 100A, the voltage drop across the internal resistance is 2V, so you really only deliver 1000W, or 5/6 of the power drawn from the battery.

At 50A, the voltage loss is only 1V, so the starter motor gets 550W, or 11/12 of the power drawn from the battery.

Finally, at 200A, the voltage loss is 4V, meaning that the starter motor only gets 1600W, or 2/3 of the power drawn from the battery.

And this is under relatively good conditions. If the battery is freezing cold, the internal resistance can become much higher.

In all cases, the internal resistance leads to heat inside the battery.

Hopefully, it should be clear that for low resistance / high wattage loads, the Wh metric is not very accurate.
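
The same numbers as a quick sketch:

  u_nominal, r_int = 12.0, 0.020  # 12V battery, 20 milliohm internal resistance

  for amps in (50, 100, 200):
      delivered = (u_nominal - amps * r_int) * amps  # P = (U - I*R) * I
      drawn = u_nominal * amps
      print(f"{amps} A: {delivered:.0f} W delivered of {drawn:.0f} W drawn")
  # 50 A: 550 W of 600 W (11/12)
  # 100 A: 1000 W of 1200 W (5/6)
  # 200 A: 1600 W of 2400 W (2/3)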


I'm not an expert, but as a (former) drone enthusiast, it's probably because of C values. C values tell you how quickly a battery can be safely discharged; multiplying the C value by the capacity gives the maximum current. And for batteries in the before times, the voltage was clearly printed on the battery, as were the mAh and C values.


C is a multiple of current, but it is linear with (nominal) energy too; no reason that would force this.


Marketers like big numbers. A 3500mAh li-ion cell is more betterer than the same 12.6Wh cell, because 3500 is a bigger number than 12.6.


It's not clear to me where the mistake is. Almost everything uses Li-ion batteries with a nominal voltage of 3.7V, so if it's mAh, 3.7V is assumed; else if Wh, that implies raw_terminal_voltage != 3.7. mAh ratings normalized for 3.7V are comparable, and Wh ratings are comparable too. It's also probably not wise to connect a mAh-rated battery to Wh-rated ones (if it ever could be).

In electrical design, electrical current (A) matters for trace width and wire thickness selection; voltage (V) matters, separately, for matching supported voltage ranges and for component insulation. Total wattage (W) tells neither story. It would concern me if a single strand of 32AWG wire were supposed to carry 10kA, or if two studs 1.27mm apart were energized to 10kVDC, but for 10W of power at unknown current or voltage... I can't be so sure.

mAh is fine. So long as you don't mix different techs, like standard alkaline, NiMH, and Li-ion, in the same bucket. Which you don't anyway.


The mistake is that many (most?) devices use batteries in series as well as parallel. So as soon as someone builds a 2S battery with nominal 7.4v, mAh is useless as a sole datapoint


you missed the point.

the laptop's battery is not 3.7v but more likely to be in the 20-ish volt region because the cells are in series. the assumption of 3.7v is plain wrong.


Laptop batteries are always marked in Wh, because there are 4, 6, and 9 cell options, always in series. Power banks are 99Wh or fractions thereof due to airline safety requirements.

If you want to charge a laptop with a 4S 11Wh slim pack using a 20100mAh battery with 20V/3A USB-PD out, that's 4 times worth laptop capacity...? Who cares? Is that why you want everything to be Wh marked?



