Anyone know why this extremely irritating change had to be made?
Otherwise, what's the point of having those things?
Then it’s a question of how many are impacted and how often, how support for USB-C factors in to the decision to purchase, etc., etc.
It’s a trade-off - they COULD build a laptop which can run everything at 100% 24/7 indefinitely, but it’d be heavier and more expensive, with zero benefits to their target audience of "people who only run at 100% for a few hours per day"
If there was any added weight it would be with the charger, not the laptop so... Incredibly unlikely.
It's way more likely that Apple just didn't design the connector to be able to deliver that amount of power. So they'd have to add a second one or switch connectors/design a new one.
Both would be suboptimal as well.
That’s what I’m talking about too
> If there was any added weight it would be with the charger, not the laptop so…
Whether it’s charger or laptop, that’s still more weight that the end-user needs to carry around in their backpack, which 99.9% of them won’t need
Only too-clever-for-their-own-good-designers like automagic things. No user does.
I don't understand why this would be a design assumption, especially for a portable device.
This isn't even an assumption for running applications on servers, especially the kind that serve up things like web requests. If you have pegged the CPU, your application will already be unstable.
Sure, the server can handle being pegged 24/7 from a power and cooling perspective, but in practical use no sysadmin will allow that to happen for very long.
Isn't this bad design?
It's also not just Apple. Lower wattage Lenovo chargers (including ThinkPad) have two-prong adapters and plugs.
Sure, some of it is due to improved efficiency of the components, but when combined with power drain while on a charger under high loads, it's fairly obvious.
If you use one of these then your laptop will charge more slowly than if you just plugged in the power directly.
* charging block
* male USB A to male USB C cable
EDIT: I just clicked your link; no I'm not using that. The cable is plugged into the charging block on one end and the laptop on the other end.
I game on my Macbook Pro + two monitors without discharge unless I accidentally use my girlfriend's OG Macbook charger instead of the Pro's charger that's twice the brick.
And it can definitely handle Civ 6 (2017 and 2019 MBP).
Incredibly annoying once your laptop suddenly slows down and dies after 5h of gameplay. How is this even acceptable? Regardless of whether it's a $400 or $3k laptop, I'd expect it to work while charging. I'm two apps (Photoshop, Lightroom) away from switching to Linux.
So yeah, we seem to have hit a limit with traditional chipsets. Probably why Apple is going full steam on their own ARM SoCs. The iPhones and iPads out now can do some pretty serious work, just imagine what could be achieved by bringing that power to the Mac.
Reminds me of the early brick-sized mobile phones. Sure, you could be wireless, but if you wanted to use it for more than a few minutes, you had to find yourself a power socket.
Normally, I'd call customer support and assume it's an issue with my device specifically. HN saved me a bunch of annoying phone calls.
Yes, that's exactly what the situation was. I was just trying to confirm the phenomenon of "battery can discharge if the AC adapter is temporarily insufficient" - but the core problem was my fault for sure.
Do you consider that to be faulty behavior, or just par for the course for heavy workloads on mobile platforms?
I'd hate for that to be the new norm. I use the hell out of my mobile computers; I don't need them draining before I even unplug.
My laptop is one of the older MagSafe ones, so for the people talking about USB-C, this might be a YMMV kind of thing.
They are usually constructed of layers of battery material, wrapped in kapton tape or an equivalent, with the entire package shrink-tubed and a pigtail coming out the side for the device interface.
One could easily crush the unit by hand; the construction offers little to no protection from physical forces.
From the tear downs I’ve seen, it seems like the primary motivation for the non-removable battery is so they can cram it into every spare piece of space.
Oh right, the laptop looks pretty and the battery can't be fitted in one area without adding some thickness.
I'm so sick of this form-over-function crap that everyone copies.
Making stuff removable is a lot more work; you have to build a way to hold the thing while it's on, you have to put in connectors, etc. With non-removable batteries they can just glue them on & solder the wires.
Not really saying it's a good thing, but I understand the decision.
Though in practice, the MagSafe connector wasn't cat- or human-proof, meaning that when one of them got anywhere near the cable it was bye-bye unsaved work and, in the worst case, bye-bye boot because of a messed-up system. I ended up taping the connector to the laptop to avoid that. As for why I'd run without a battery: it was swollen.
That’s not the case I assume.
You'd want some minimum charge to, at least, safely start up and immediately shut down again. Maybe throw in a little extra to be on the safe side. The higher-performance MacBooks also operate across an extremely wide range of power requirements, from about 5W to close to 100W. I'm not sure if they can throttle based on battery status, but if not, you'd need a buffer to accommodate load spikes.
I guess you could criticise them for not trusting you to keep it plugged in. But any UI to clearly communicate that unplugging now might cause data loss is bound to be a horrible kludge. It could also easily fail if you happen to trip over the cable or the power goes off.
I have never experienced this behavior on any other laptop I have ever used. When I ran into it for the first time I was completely appalled; it's a horrible user experience.
I would be irritated to no end if this happened to my laptop.
Not making a personal jab at you -- just illustrating the difficulty that everyone who makes a product faces when managing user expectations.
They are annoyed because the phone is unusable for a period even when connected to a charger. Every Android phone I have ever used will work immediately after being plugged into a charger (even without a battery). This has nothing to do with "saving" them from having no phone (in fact, it effectively leaves them with no phone at all for a period of time), and everything to do with the phone's inability to power itself directly from a charger.
Tough to make something for customers? Perhaps, but seemingly every other phone manufacturer gets this right.
Maybe other phone manufacturers are cutting off your "on"-time sooner than iPhone is, and you just don't ever see their phones getting to this state. You don't even get to make your 911 call, because Samsung chooses to turn your phone off sooner?
So again, in that case Apple would be giving you more usable time, but you're not happy with them for it.
Really hate it. It makes my laptop feel more like a "device", if you know what I mean, and makes it feel like the charger is insufficient for its job. Almost 20 years of using PowerBooks, and even completely dead you could just plug them in and power worked straight away; now you have to do the 10-minute power-button dance and the thing doesn't even have a startup chime to tell you it's worked.
However, waiting for an initial charge has always been the case if you use a smaller power adapter than the one that shipped with your MacBook, for example using a MacBook Air 40W charger with your Pro.
I’m a little bummed to learn that this apparently isn’t the (only) reason.
All of this could be fixed if the device gave me power settings but even third party paid tools that require custom kernel extensions (Volta) still don't work reliably.
My next computer is not going to be from Apple for sure.
I absolutely hate it. It throttles constantly even after undervolting it. I had to do a bunch of black magic to get it to sleep properly (which is evidently happening to every Dell) and eventually gave up on that and just set it to hibernate any time the lid is closed (it's 32GB, so that adds about 30 seconds to the startup time). I've spent more time tweaking this thing, reading forums and Reddit about how to make it perform DECENTLY, than I did building my last hackintosh, and I don't enjoy that experience at all. When you get past all these issues it's still Windows 10, which I just find to be the most annoying OS I've ever used.
Just got my new MBP yesterday and couldn't be more excited to be back on osx. I do really, really wish my MBP was smaller and a 2 in 1, though.
I have a Dell Precision 7540 that I was having trouble getting sleep to work on. It would panic on resume, I think because of WiFi, but I couldn't fix it. So I enabled "hybrid sleep", which is like hibernate and sleep together. Before it sleeps, it flushes the RAM to disk. When it wakes up, it tries to boot from RAM and, if successful, it deletes the hibernation data. But if it crashes before it deletes the files, you just reboot and it resumes from disk. So I never "lose" my state, and 95% of the time my computer resumes normally.
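For what it's worth, on systemd-based distros the behavior described above maps to "hybrid sleep"; a sketch of enabling it (option name from systemd-sleep.conf; whether it actually works depends on your kernel, firmware, and having a swap/hibernate image configured):

```ini
# /etc/systemd/sleep.conf -- sketch; assumes a working hibernate setup
[Sleep]
AllowHybridSleep=yes
```

It can then be triggered manually with `systemctl hybrid-sleep`, or wired to the lid switch via `HandleLidSwitch=hybrid-sleep` in logind.conf.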
There are certainly other frustrating bits with Linux, but at least the flexibility to fix the issues is within your grasp. That was my biggest gripe with OSX and Windows. You just aren't always able the fix the issue and it's very mentally painful (for me). :P
I went back to macbooks ~1 month ago. The Dell is still brand new (4 months old, ~$4,000 fully spec'ed), but my employer does not want it back, so it is now sitting in a drawer in my office.
I ended up billing my employer for the ~120h I spent "configuring" the laptop over those 4 months. With that included, the laptop cost my employer about $14,000.
Now I'm using my personal 2013 macbook air instead, and am finally able to get something done, yet I'm not happy with having to use my own laptop for work.
Had they bought me a MacBook for $1,500 instead, like I requested, that would have saved them ~$13,000, plus my lost respect, plus my continued lost respect for requiring me to use my own 7-year-old personal laptop for work.
The people for whom Ubuntu works usually have fewer requirements or are not as particular about things working _just right_.
That’s not to say I haven’t had issues with Arch, but there were fewer and they were easier to figure out.
Now if you want a distro where you can update packages without manual intervention several times a year, then it’s probably best to look past Arch.
> Arch was the best distro I could find that used the latest kernel by default. Audio worked there.
That mirrors the experiences I’ve had. Most distros ship with broken old software and custom patches that make troubleshooting a nightmare. They also don’t have the resources to support the wide variety of hardware out there. Microsoft spends billions on Windows and vendors spend billions of their own to make sure hardware is mostly plug and play.
In Linux land, the mainstream distros ship with fragile duct-taped components that work if you’re lucky but are a complete pain to troubleshoot and are a bigger pain as time goes on and you have major upgrades.
Arch is the only sane distro with wide adoption and it’s a joy to use. I get the latest kernel with its bug fixes and hardware support. Packages are updated on a rolling basis so there are a few instances every year where I need to manually intervene, but not major breakages every 2 years.
Granted, it’s not for everyone. You have to recreate the fragile duct-taped components you get in other distros, which is an investment of time. You will know how and why your computer works though and will not have to resort to the Ubuntu stack exchange (or their other communities) for the magic formula that will fix your issues.
I’m hoping we can change hearts one by one with this message: Arch is not scary; it might solve the problems you’ve been having :D
Laughing in Fedora. Out of curiosity, how many "mainstream distros" did you use?
Arch was the best distro I could find that used the latest kernel by default. Audio worked there.
Really? I love my XPS 7390 2-in-1; it's the best laptop I've used and owned by far, and that includes MacBooks. Combined with a WD19TB dock (and a useful trick of flipping it around so it's an inverted L) it makes a great work-from-home setup. Its thermal profile is relatively aggressive by default, but it should stay at 15W (and ~65 degree temps) indefinitely. Up the power limit to 25W (like the Windows / Dell "Ultra Performance" mode does) and it appears it can maintain that too. I mean, the thing can even run Crysis.
That said, I'd probably lose my mind if I had to run Windows on it. A suggestion for getting good old S3 sleep to work: enable the hack for re-enabling S3 sleep in the Windows registry, then disable "Early Signs of Life / Dell Logo" in the BIOS. I'm not sure if this will work on Windows, but it works flawlessly on Arch.
It's been a while since I played with this, but I don't believe I was able to bump the power limit to 25W on Ice Lake. Maybe I'm wrong.
Maybe I'll do a fresh install and give it a try again. I've heard a few comments where people's 7390 2-in-1s were running great, but on /r/dell I've seen a lot more with complaints like mine.
I did get some improvement by dropping the thermal/power plan from High Performance down to Quiet. That seems to keep the thermals down so it triggers throttling less.
My problem is it throttles constantly and has other issues where if you move it (lift it up) while it's under load it will immediately throttle and I trigger it sitting on my lap a lot.
I bought it hoping I could run a bunch of VMs (thus the 32gb) and do light development work on it but I can barely even draw in Figma without having a rough experience so I completely gave up on developing on it.
This is a really good article https://getpocket.com/redirect?url=https%3A%2F%2Fwww.playerz...
Single core turbo 3.9GHz, maintains indefinitely, temperature hovers around ~75 degrees, fans around 5000RPM
All core turbo: Keeps 3.4GHz and ~45W TDP for around 10 seconds, before dropping to 2.7GHz and ~25W TDP indefinitely. Temps still around ~75, with the fans around 8000 RPM.
That's without any undervolting, but with the processor set for 25W TDP (like the Windows "Ultra Performance" mode). Adding in some undervolting gets the all core turbo up to 2.9GHz.
The only bad thing is that it requires a kernel extension, so it will stop working when those are deprecated in 10.16. Hopefully Apple introduces a similar feature, as there were rumours of a "Pro" mode coming which would ramp up the fans and clock; my hope is they add a complementary "quiet" mode or similar to disable Turbo Boost.
Pretty much all 15" laptops with 45W Intel CPUs turn into jet engines when under load, at least in my experience.
Also, not to defend Apple, but I do find the MacBook Pro cools down and gets quiet more quickly than the Dell does, although that could be down to my personal setup (tools, configuration, etc.). It could be the more powerful Nvidia dGPU on the Dell; I'm talking about a purely CPU workload, but with the shared thermal solution the Dell has, it could be a bigger factor than I think.
As long as people buy for sleekness first and thermals second this is the outcome you get.
It is possible to buy thicker/bulkier laptops that do a much better job at passive cooling and so don't get so loud.
Personally I prefer the lighter/slimmer laptops for easier portability and am fine with the trade off of them being louder and thermal throttling a bit under load but I appreciate not everyone feels the same as me.
Apple tunes its machines to spin the fans down later, which is why you're noticing this.
This is because when the average user runs a CPU-intensive task, it's unlikely they'll run another one seconds later, so the CPU can cool off without all the fan noise.
And I've always gotten better battery life on average on MacBooks than other laptops, and in macOS than in Windows on the same Mac.
The problem is when there are bugs (of which there are many) in the other higher-level daemons and third-party apps that get stuck in some task and cause the CPU to unnecessarily shoot up.
For example I try to do things like renice +20 for every Steam process, because it's a bad macOS citizen and suddenly shoots up to 100% CPU for no reason, but it doesn't help until I terminate them. Even the TextInput etc. daemons get randomly wonky and have to be terminated.
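The renice workaround above can be sketched in Python; this is my own sketch, not Steam-specific (the `demote` helper and the nice value of 19 are my choices; 19 is the weakest priority on Linux, while macOS accepts up to 20):

```python
import os
import subprocess

# Drop a runaway process to the weakest scheduling priority so it can't
# starve the UI -- the same idea as `renice +20 -p <pid>`.
def demote(pid: int, niceness: int = 19) -> int:
    os.setpriority(os.PRIO_PROCESS, pid, niceness)
    return os.getpriority(os.PRIO_PROCESS, pid)

# Demo on a throwaway child process (a real Steam PID would come from pgrep):
child = subprocess.Popen(["sleep", "5"])
new_nice = demote(child.pid)   # 19 after demotion
child.kill()
```

Note that, as the parent comment says, lowering priority only limits scheduling pressure; it won't stop a stuck daemon from burning a core at 100%.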
The core thermal management of macOS seems to be better than Windows on the same mac device
As an aside the 2019 redesign of the Mac Pro likely has the best thermal design we've seen from Apple by far. I often wonder what an actual mobile workstation laptop from Apple would look like, I'm talking at least three times the thickness, hardy case/screen, great thermal design, and huge battery. It would be niche but fantastic.
Keep in mind Intel's marketing as well, which advertises CPUs as producing far less heat than they do in real life.
The reason these machines are the way they are is that Apple is a commercial company and has to follow market demand.
"10 cores", "silent": you can only choose one.
If you want a reliable silent machine, use the proper tool for the job. I.e. mac mini / imac. There is the only way to get proper cooling. You can't have silent cooling in your laptop. Especially if it's "a top spec".
No. It's "10 cores", "silent", "5mm thick": choose two.
Personally I don't see the point in having a laptop thinner than 20mm. I would pay very good money for a 25mm thick, reasonably powerful laptop with super quiet cooling.
The Thinkpad T440s was dead quiet. It was the first generation of thinkpads with a 15W CPU instead of 35W. A current mobile CPU configured to 15W TDP would be powerful enough for basically any task that you would throw at a single computer. Instead of keeping things quiet, manufacturers focus on making laptops thin enough to replace a knife.
On current laptops, the noise level tends to follow the system load very closely. Just putting a bit more thermal mass into the cooling system would allow for a much more steady noise level, which is much less annoying (and the fan would not have to spin up at all on short load bursts like starting a VM).
They focus on what customers want. If customers want 10-core, 5mm-thick notebooks, they'll get them. They'll be noisy, but who cares; they'll be used to like Facebook posts anyway. Professionals will buy stationary computers for the heavy lifting.
The T440 is an awesome machine, although the X1, which is kind of a slender version of the T440, is more popular.
I, a professional, have no say in what equipment I use in my profession — my employer does, and they do not allow outside equipment. (I get 1 MBP.)
I don't think this at all uncommon, either; I've only worked for one company so far that allowed personal machines, and that was only briefly while they were so small they weren't purchasing any equipment for the employees yet. (They rapidly outgrew that.)
This isn't a "laptops" issue, this is an "ultra thin" issue. Trouble is most people only want to buy ultra thin laptops and don't realize that that super powerful 8 core cpu isn't any faster than a much less expensive one when there isn't adequate cooling.
Not practical at all, of course, but this is well within the laws of physics. (And people do lesser things all the time, using e.g. those laptop “cooling stands” with clearance and fans built in.)
The recent metal-cased RTL-SDRs from rtl-sdr-blog have this "problem". The metal case actually helps the chip run cooler, but it gets hotter to the touch than the early insulating plastic models.
This can extend your battery a decent amount, and obviously would cut down on fan noise.
Also, it's questionable whether running at lower clocks saves power overall. Apparently it's more efficient to run higher clocks, but for shorter periods of time than it is to run medium clocks for extended periods of time.
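That "race to idle" intuition can be put in a toy calculation; all wattages here are invented purely for illustration, not measurements:

```python
# Fixed amount of work, two strategies over the same 30 s window.
def energy_joules(active_w: float, active_s: float,
                  idle_w: float, window_s: float) -> float:
    # Energy spent working, plus energy idling for the rest of the window.
    return active_w * active_s + idle_w * (window_s - active_s)

# High clocks: 30 W for 10 s, then idle at 2 W for the remaining 20 s.
fast = energy_joules(active_w=30.0, active_s=10.0, idle_w=2.0, window_s=30.0)  # 340 J
# Medium clocks: 12 W for the full 30 s, never idle.
slow = energy_joules(active_w=12.0, active_s=30.0, idle_w=2.0, window_s=30.0)  # 360 J
```

With these made-up numbers, racing to idle wins (340 J vs 360 J), but real chips are nonlinear in clock vs power, so it genuinely can go either way.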
Turbo Boost Switcher's website says it can adjust Turbo Boost based on fan speed (only in its Pro version). This is probably the best way to do it: disable Turbo Boost only when it's running for long enough to turn on the fan.
I honestly think a "no fan noise" setting should be a built-in feature.
I have found that "gaming" on my MacBook almost requires it, as for whatever reason the fans will NOT spool up when the Intel GPU gets hot.
Happily sprang for the Pro version; it's needed to avoid having to re-authorize it on every switch.
My reading skills still suggest SMCfanControl fulfills the description given by the parent post
I know Apple has screwed up in the past with throttling iPhone CPUs (bad). At this point they know not to go to those lengths again.
Apple's change did the right thing. It fixed a ton of older phones which before were rendered unusable since they would experience random shutdowns due to naturally degraded Li-Ion batteries.
Every manufacturer should implement the kind of fix Apple did (and I'm sure at this point, they have).
They even mentioned the change in the original release notes for the iOS update.
No, they didn't. They amended the release notes after the fact.
Android has been doing it for years, and it's always been toggleable. If Apple didn't want to implement a toggle that's fine, but they could have easily indicated to the user that their device's battery has degraded, that batteries are replaceable, and that performance can be restored by replacing said battery.
The Battery Health menu should have been implemented from the start, and the controversy probably would have been completely avoided.
My library has a dropbox to refurbish old ones for blind people, for example, but I'm sure dozens of places will be happy to take them off your hands.
Replacing the battery would have fixed the issue at any time. It's not like they were holding it back.
Apple's own front-line staff (i.e. Geniuses) were not informed of the throttling. So if you complained of a slow phone, you were told you were imagining it, or that it was an inevitable part of the newer, more complex OS upgrades.
At any rate, you were told there was nothing you could do.
This includes all Android phones, PC laptops, etc.
People have such low expectations of the other products by default that no standards are applied and no quality is expected.
This is the issue. We apply (rightfully so) a far higher standard when assessing the industry leader (Apple) but we apply no standard at all to the competition which frequently gets away with the same kinds of issues (and also issues that are much much worse).
> we apply no standard at all to the competition which frequently gets away with the same kinds of issues (and also issues that are much much worse)
... is total BS. Google and Samsung are never, ever, given a free ride when they mess up. Come on. Let's not be hyperbolic.
> The battery degradation issue is something that affects nearly all Li-Ion battery powered devices. This includes all Android phones, PC laptops, etc.
Irrelevant. It's not a question of whether the throttling is a valid solution to a technical limitation (it is). The problem is that (a) there were no major manufacturers throttling CPU speed due to battery health in phones, tablets or laptops before this. It was not a known practice. And (b) since Apple actively hid the throttling from everyone, including its own employees, very few people were aware that a simple battery replacement would bring the phone back to like-new performance.
You seem to have gone from outright falsehoods - "They even mentioned the change in the original release notes for the iOS update" to mischaracterisation "Replacing the battery would have fixed the issue... It's not like they were holding it back." and now you've moved on to "what about the other guys!?" deflection.
It puzzles me how some people will so blindly defend Apple without critically looking at the facts. Just admit it was a mistake and looked really, really bad.
Apple has. Why not you?
And for the record, I'm a very happy Apple customer. I've used their computers exclusively since 2005 and their phones exclusively since 2010. I rely on their products and ecosystem to earn a living. But I'm not blind and I'm not stupid.
It was in the note as far as I'm aware. You're saying that as if I'm trying to spread misinformation. I don't appreciate that at all. It's very rude.
> mischaracterisation "Replacing the battery would have fixed the issue... It's not like they were holding it back."
That's not a mischaracterization. You absolutely could have gotten your battery replaced. There were even tons of 3rd party services that were offering battery replacement along with screen replacement. It's not like it was some sort of dark secret or as if they banned you from replacing your battery.
> and now you've moved on to "what about the other guys!?" deflection.
I think people should be angry that other manufacturers weren't doing anything to extend the life of their batteries. I see that as worse than using techniques to manage battery life (which the operating system is doing at all times by the way - same with your GPU).
So while your statement is technically true, it's irrelevant and misleading. Of course battery replacements have always been available! How does that change anything at all? You aren't addressing the central accusation (which is the secrecy). You aren't bringing anything new or interesting to the discussion.
Sorry if I've been rude. But you've been corrected a couple of times without acknowledging it, and continue being slippery by arguing a position without addressing the core accusation. I wouldn't say that's rude, but it's frustratingly bad etiquette.
>I think people should be angry that other manufacturers weren't doing anything to extend the life of their batteries. I see that as worse than using techniques to manage battery life
Sure. I've never owned another smartphone and can't comment on how those customers feel. You're probably right. At any rate it's not relevant to how Apple treated its own customers is it?
Try to reply while addressing the central accusation: that the throttling itself is a perfectly valid solution, but it was wrong not to inform consumers it was happening. That it was wrong to have customers with $700 phones and $100 AppleCare be told by Apple Geniuses that they were imagining the slowdown and nothing could be done... except buy a new phone.
Now, I'm not 100% convinced Apple had nefarious intentions in withholding the info from staff and customers. But, neither you, nor I, will ever know that. All we have to go on, is the facts of what happened and how people were treated. It seems like a black and white, open and shut case to anyone objective. People were lied to, plain and simple. How can you defend that?
Absolutely no one is going to assume their phone is slow due to an old battery.
A. Devices lose battery life per charge over time, eventually powering off randomly because not enough power can be supplied to run the hardware.
B. Devices throttle their CPU over time to keep battery life per charge relatively stable, and avoid powering off.
Apple chose B.
I also don't understand the people who are so upset that Apple didn't say anything - any hardware/software company will make thousands of little tradeoffs like this during R&D, they can't be expected to publicly announce every time they make a decision during development.
I'm surprised to be defending Apple on this, normally most things they do seem pretty anti consumer and leave a bad taste in my mouth. But in this specific case I really don't think they deserve the scorn.
The warning is there for that reason. The phone still works with 3rd party batteries.
Doesn't it mean Apple basically half-assed it?
Until they come up with something that actually works I'd be grateful to be able to set a lower max charge if I know that the device is mostly used plugged in. And since nobody has been able to come up with anything in over 10 years of dead batteries from devices which are always plugged in, I doubt that this will be resolved anytime soon.
Aren't SSDs overprovisioned in this manner? You sell a drive with a terabyte of storage, but actually ship more than a terabyte of chips in the case to accommodate degradation over the life of the device.
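Roughly, yes. A back-of-envelope sketch (the 1 TiB raw figure is an assumption; actual spare area varies a lot by model and market segment):

```python
# Marketed capacity is decimal; suppose the drive actually ships 1 TiB of NAND.
advertised_bytes = 10**12     # "1 TB" on the box
raw_nand_bytes = 2**40        # 1 TiB of physical flash (assumed for illustration)

spare = raw_nand_bytes - advertised_bytes
overprovisioning = spare / advertised_bytes   # ~0.0995, i.e. roughly 10% spare area
```

The controller uses that spare area for wear leveling and for retiring worn-out blocks, which is why capacity appears stable even as cells degrade.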
Because the last time they tried "silently manage the battery without the user being part of the decision" it didn't go over real well?
Some EVs do something similar: the Nissan Leaf, when the battery gets really low (and I mean it, far past the point where you've lost all percentage and mileage indicators), will enter a reduced-power mode (popularly called "turtle mode"). This allows it to eke out a few more miles from an otherwise mostly dead battery. There is a "turtle" that lights up on the screen; maybe Apple could just do that.
Turns out batteries hate it when they're almost empty and you're trying to extract full power from them; the voltage drops even more. So there are some compromises that have to be made.
Without power management, when a high-demand task came along and the CPU tried to draw more power, the phone would crash or shut down.
The firmware update capped performance not to extend the run-time when the batteries were low, but to allow the phone to run reliably when the batteries were worn.
TIL. I thought I was pushing it coming home with 2 miles left…
No offense, but for me it's like the Mac is taking my hand and guiding me through a mess. However, sometimes I know the way better than the Mac, but it won't let me. It actually makes it really hard to "go my way".
For example, the restriction that only programs from trusted developers can be run. I understand why it makes sense for most people, but I want the option to decide for myself who is to be trusted. To be fair, I had to google a bit for a terminal command to fix this, but either way it leaves a bitter taste. I fear I may lose this "fix".
Now back to battery health. Why not just be transparent? If most users want 100% showing, then make that the default.
I think that's what it boils down to for me: having no choice with OS X. Now I know some will say that's just how it is with Macs, but do you think it's too much to ask for this (really stable) system to support choices?
Gatekeeper, the "only trusted developers" restriction, is a setting in System Preferences. Windows also ships with an equivalent setting enabled by default.
A fun side effect of this is that the tablet can boot while at '0%' battery, and then it will automatically shut off a little bit later to avoid dipping below the safe range - presumably '0%' is actually like 10% or so.
The Verge piece on this says that this is what will happen.
This whole discussion is colored by confusion between the heavily simplified model presented by the charge indicator and the raw battery properties.
Fun fact: there is no "80%" level either as seen from the battery controller's POV; that's just a forecast of how much energy the battery might be able to produce if run down to the cut-off level (and the 0% level here is also a fair bit above the chemically empty state), and the battery controller is constantly updating its model of how the battery behaves when discharging to keep the forecast accuracy reasonable as the battery ages. When batteries come from the factory, the calibrations for 0%, 80%, and 100% are at certain voltages, and they constantly move throughout the life of the battery. (Maybe not the 0% voltage.)
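A toy model of that re-learning loop (all numbers invented; real gauges track voltage, current, and temperature, not just a single capacity figure):

```python
# Sketch of a fuel gauge: "100%" is a moving estimate that the controller
# re-learns as the pack ages, so the same stored charge reads differently.
class FuelGauge:
    def __init__(self, design_mah: float):
        self.full_mah = design_mah      # learned full-charge capacity

    def relearn(self, measured_full_mah: float) -> None:
        # After a full discharge cycle, adopt the capacity actually delivered.
        self.full_mah = measured_full_mah

    def percent(self, remaining_mah: float) -> float:
        return 100.0 * remaining_mah / self.full_mah

g = FuelGauge(design_mah=4000)
fresh = g.percent(3200)   # 80.0 on a new pack
g.relearn(3600)           # the pack has aged; the controller updates its model
aged = g.percent(3200)    # the same stored charge now reads ~88.9%
```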
I seem to remember it being something like the hardware capacity is one number but there's a software upsell that you normally need to pay to unlock full capacity. But it may have also been a "we'll push the normal operating parameters to give you this extra boost", don't quite remember.
Besides that, we have proper units for binary multiples: KiB, MiB, and so on. I've only seen KDE use them properly so far, and it's a shame.
Approximately 0 average users know about them, and of the technical users, only the truly irritatingly pedantic are going to use them (which may be why they've shown up in Linux).
Calling it a "2 TiB disk" when it's actually a 2,000,000,000,000 B disk would be a blatant ripoff.
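For concreteness, the gap between the decimal and binary prefixes is just arithmetic (this snippet only does the conversion):

```python
TB = 1000**4    # decimal terabyte, as disks are marketed
TiB = 1024**4   # binary tebibyte

disk_bytes = 2 * TB                     # a "2 TB" disk
print(round(disk_bytes / TiB, 3))       # 1.819 -> what it holds in TiB
print(2 * TiB - disk_bytes)             # 199023255552 B short of "2 TiB"
```

So a disk sold as "2 TB" really does contain 2 trillion bytes; it's only when a tool relabels that as "1.8 TB" (meaning TiB) that the numbers look like they disagree.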
That seems to me to be quite unfriendly to privacy. And given that I always charge my phone overnight and use my phone’s alarm function, it could simply use that as the target time to finish charging to 100%. No complicated location-based ML needed.
Having the machine upload or make available that information to someone else does. I don't know what the Mac does here.
If your phone never gets past 80%, it could be the battery just needs replacement. Batteries are consumables, and they do degrade after a year or two or three (depending on cycling, temperatures, use, etc). iOS has a battery health indicator giving you a rough estimate of the state... if the health is low, get a repair kit from ifixit and then it's a fun evening activity to open one of those puppies up and see just how tiny everything in there is!
"it's like blowing up a balloon -- as it gets bigger, there's more back-pressure so it gets harder to fill up the last bit."
That's not actually the case with a balloon. :)
If you think about it, I'm sure you've noticed that you have to blow the hardest right when you start, and then it gets easier. The reason has to do with the curvature of the balloon and the fact that the same amount of air causes less and less stretching of the material as it fills up.
Instead, they are saying it's dumb because it doesn't slow down when it can charge fast but doesn't need to charge fast.
If your phone is at 40% and you plug it in, it will probably finish charging in about 2 hours. But if you plug it in at bedtime, you don't need it to be done in 2 hours because you'll still be asleep. So why not figure out the lack of urgency, then take 4 or 5 hours and keep the battery temperature lower?
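The scheduling idea is simple arithmetic: stretch the charge over the time you actually have. A toy version (the function and its numbers are illustrative, not any vendor's algorithm):

```python
def charge_rate(percent_now, percent_target, hours_available, fast_hours=2.0):
    """Return the fraction of full-speed charging needed to finish
    exactly when the time runs out (capped at full speed)."""
    needed = percent_target - percent_now
    fast_rate = needed / fast_hours        # %/hour at full speed
    slow_rate = needed / hours_available   # %/hour to just finish in time
    return min(1.0, slow_rate / fast_rate)

# Plugged in at 40% at bedtime, alarm in 8 hours:
print(charge_rate(40, 100, 8))   # 0.25 -> a quarter of full speed
```

Charging at a quarter of the speed means a cooler cell for most of the night, which is exactly the "lack of urgency" being described.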
Out of curiosity, have you found a better ELI5 analogy that doesn't bend over backwards with a Rube Goldberg setup of pipes and water flow? I've struggled to explain this phenomenon to people buying EVs, where charging behavior becomes a regular thing people need to figure out.
There's not more beer, but it fills more space temporarily when poured faster.
I have a Sony which does this (XZ1c, stock firmware)
I just want the option to manually move that brick wall down a bit (they're already telling me exactly where it is right now, on iPhones at least, on the MacBook I ask coconutBattery).
I realised the other day that Macs don't give 'battery time remaining' anymore, although I'm not sure when that was removed.
Edit: apparently you can still see time remaining in Activity Monitor
I think the real difference is that car manufacturers' business health depends on less maintenance over many years, and phone manufacturers' business health depends on planned obsolescence.
Laptops don't get used like that. People plug them in, then unplug and walk around with them until either they run out of power or their meetings are over. It's really asking too much to have people plan their usage out. Plus the draw rates aren't predictable. If I can get away with just checking email, my laptop lasts all day. If I have to do a big C++ compile, that's 30 watt-hours right there.
I might also add that really only Tesla is doing the charging guidance UI. Other car manufacturers spend a lot of resources to perpetuate the illusion of a battery working like a fuel tank.
root required, obviously.
I assume checking the box to not use this setting will work the same way but I want to make sure I switch it back. Just like how Tesla prompts you intermittently.
The only big issue is that they don't expose it through a friendly UI within Windows 10. It's nonsense that I need to go to the BIOS to tweak these settings.
There's also a DCC package for Linux which can do the same thing. I made a thin Python wrapper which would read a number and set that to the maximum battery charge limit (e.g. $ battery.py 85). Unfortunately I got rid of the xps so I don't have the script, but it was very simple. Hard part was figuring out what the cmdline arguments to the DCC executable were.
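On machines whose firmware exposes the limit through the kernel (many ThinkPads do), the same one-number wrapper can be written against sysfs instead of a vendor tool. The attribute below is the standard kernel `power_supply` name, but whether it exists depends on your hardware and kernel; this is a sketch along the lines of that lost script, not a reconstruction of it:

```python
#!/usr/bin/env python3
"""battery.py N -- cap charging at N percent (root required).
Usage: sudo python3 battery.py 85"""
import sys
from pathlib import Path

# Standard kernel attribute; only present if the firmware supports it.
THRESHOLD = Path("/sys/class/power_supply/BAT0/charge_control_end_threshold")

def parse_limit(arg):
    """Validate the requested charge ceiling."""
    n = int(arg)
    if not 1 <= n <= 100:
        raise ValueError("limit must be between 1 and 100")
    return n

if __name__ == "__main__" and len(sys.argv) > 1:
    limit = parse_limit(sys.argv[1])
    THRESHOLD.write_text(f"{limit}\n")   # fails without root or fw support
    print(f"max charge set to {limit}%")
```

Note the setting usually doesn't survive a firmware reset, so it's worth re-running after BIOS updates.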
Better yet, all devices, phones, tablets, notebook computers, headphones, speakers, ... should automatically stop charging at about 75% and then continue to run directly off of the wall. I used to think that's what all devices do because that just seems like common sense to me.
The maximum charge state is some percentage below the battery's physical maximum, and the maximum discharge state is some percentage above the battery's physical minimum.
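In numbers, those buffers just shrink the usable window; the percentages below are made up for illustration, not any vendor's actual margins:

```python
def usable_wh(physical_wh, top_buffer=0.05, bottom_buffer=0.05):
    """Usable energy when 'fully charged' stops below the physical
    maximum and '0%' cuts off above the physical minimum."""
    return physical_wh * (1.0 - top_buffer - bottom_buffer)

# A pack with 100 Wh of physical capacity, 5% reserved at each end:
print(round(usable_wh(100), 1))   # 90.0
```

The user-facing 0-100% scale is then mapped onto that inner window, which is why "0%" devices can still boot for a moment.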
There's a bit in this Wikipedia article that states that charging Li-ion batteries beyond 80% can drastically accelerate battery degradation.
Although they would be compared against last-gen Macs. Saying "the battery is half the size" would need some marketing polish, implying that it's lighter or something.