Not going to lie, that’s pretty horrible.
> Battery life would increase by 50% if I got PC6 or PC8 idle states. The fan only turns on if I’m doing something intensive like compiling go or scrolling in Slack.
lol. One of the things that really drives me nuts is my computer's fan turning on when I know it really shouldn't be. I have lived and worked with people for whom having their fan randomly turn on for no reason is completely normal, and I just can't understand how they can bear it. If this happens to me, you can bet I'm digging through Activity Monitor and killing the culprit before the fans can get fully ramped up.
My fan comment was a joke about Slack's efficiency. Of course compiling a bunch of go code will make the fan turn on. That will use up 100% of your cores on any decent sized project.
At idle, browsing something like HN or sitting in #emacs, I get near 20 hours on my T450s and about 13 on my X220.
I think low low low idle power consumption is key to long, infrequent charges. The second you are not using your computer, it should also not be using power. For example, my T450s, without adjusting the screen brightness, idles at 2.8W, so I get tons of idle time with a 97Wh battery.
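To put that 2.8W figure in perspective, here's the arithmetic as a trivial sketch (battery capacity and idle draw taken from the comment above):

```python
# Back-of-the-envelope idle runtime: battery capacity divided by idle draw.
def idle_hours(battery_wh: float, idle_watts: float) -> float:
    """Hours of idle runtime for a given battery capacity and idle draw."""
    return battery_wh / idle_watts

# Figures from the comment above: 97 Wh battery, 2.8 W idle draw.
print(f"{idle_hours(97, 2.8):.1f} h")  # 97 Wh / 2.8 W ≈ 34.6 h
```

Real idle time will be lower once the screen, wifi and background tasks kick the draw above that floor, but it shows why shaving idle watts matters more than anything else for infrequent charging.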
You can't really optimize power for when you are using it. If you are compiling, you want the fastest compile time and use the most power. Same for other things that take power.
I go all weekend not charging my personal laptop and the only times I close the lid is when I am going to bed. Otherwise I leave the lid open, same brightness and walk away for hours and come back and continue where I left off.
So I see your point about people saying "I get 9 hours of battery life" doing nothing, but even 9 hours of doing nothing is fairly low, given I can make a 10-year-old laptop get close to 18 hours, and a 4-year-old one get near 21 hours "doing nothing".
Would you mind sharing what setup you have that lets you get such long battery life?
But get powertop installed. On the fourth screen in powertop, make sure all settings are set to "Good". I made a systemd unit to do this for me on boot. I also installed tlp and let it apply its settings. Some of those overlap with what powertop does.
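For anyone wanting to replicate this, a minimal sketch of such a unit, assuming powertop lives at /usr/sbin/powertop (the path and unit name are my guesses, not the poster's actual file):

```ini
# /etc/systemd/system/powertop.service
[Unit]
Description=Apply powertop tunables at boot

[Service]
Type=oneshot
ExecStart=/usr/sbin/powertop --auto-tune

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable powertop.service`. Note that `--auto-tune` sets every tunable to "Good", which can occasionally misbehave with USB input devices; you may want tlp's per-device exclusions instead.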
I set some things on the i915 driver but that will have to wait until baby watch is over.
Does it for me.
The T450s generation (and the generation before it) use ULV processors (ultra low voltage).
The processors used in the T450s generation have a TDP of 15W.
The processors used in the X220 generation have a TDP of 35W.
On the flip side, the processors in the X220 generation (and in the next generation) are powerful, which is a good thing if you need the power and if you have the laptop plugged most of the time.
Modern T-series still have amazing damn battery life, and I’ve never found anyone else who comes close except Apple, or Android ARM hybrids like the old ASUS Transformer Prime with the double batteries.
I wish to purchase a new battery for it and am looking for a place to buy a battery that
1. will ship to Norway without too high shipping cost, and
2. isn’t going to explode in my face.
Also, mbrumlow, what distro are you running? And did you customize it in any way, aside from the powertop usage you mentioned, to get it to consume low amounts of power while idle?
You should be able to optimize and get even higher. I'm actually considering getting a new T480 (since it has the battery expanded slot), but I'm struggling to decide.
Long live the ThinkPad.
Edit: This is with mostly casual use on an Arch Linux setup.
I think the next killer feature I want/look for in a laptop is the ability to see the screen in bright sunlight. It's a gorgeous day today and I wouldn't mind spending a few hours by the waterfront coding.
This year's models are doing away with replaceable batteries for the most part. The T490 has a built-in 50 Wh battery and no ability to extend or replace it without removing screws. It also gets rid of the 2.5" drive option and one of the RAM slots (but adds a soldered one), and it replaces the full-size SD slot with a micro-SD. All of that for a small reduction in weight and thickness.
So, basically, the thinkpad line is gone now, it's just another poor macbook pro knockoff. I expect to see articles like this one for years to come, given that there are almost no options left if you want power, expandability and battery life in a single laptop package.
I do this with an OLPC XO. I also have a Cr-48 with Lubuntu. There's a tradeoff: the XO has a brighter screen but a smaller keyboard.
But it wasn't a billion-dollar opportunity, and Pixel Qi are so forgotten they were reported by the media as closed down, although they might technically still exist (and might have released their IP to the public; I haven't read too far on it just now).
I think a lot of this might be attributable to just how well power-optimised safari is. I could get 10 hours using safari when my macbook was new, but only 6 using chrome or firefox! Given that safari is speed-competitive with these browsers, it's very impressive.
I've noticed the i7-8550U in my HP Envy can easily drive consumption up to 22W during a parallel compile even though I spend most of my time idling between 5 and 7W while editing text. The extra cores seem to consume a lot of power compared to my older 5th-gen laptop with 2 physical cores.
I also wonder if mobile Ryzen behaves similarly or if Intel just trashed their power/thermals in order to compete on core count.
Unfortunately, it's really hard to find benchmarks with power vs load stats. Most of them seem to be like "7 hours watching a movie in Windows" when I'd rather see "Battery is 55 Wh but computer uses 22W when all cores are 100% and 35W at 100% CPU and GPU". You know, like actual facts about the hardware rather than subjective usage experiences.
This afternoon I upgraded my kernel from 4.15 to 4.18. Then I removed the r8168 module that I'd tried to use to reduce power consumption. Using the open source r8169 module got me PC7 states. Now my X210 idles at 5 watts. I'm guessing I could go lower if I tweaked more, but this is a huge improvement in battery life. I'm glad I took the time to look into it.
I've gotten battery consumption figures as high as 30 watts when running something like `./minerd --benchmark --threads=8`. Intel claims the i7-8550U has a 15W TDP, but it can easily go above that if it has enough cooling.
It's less of a joke and probably just a statement.
Seriously: even something as small as someone adding an animated emoji makes slack eat CPU :-/
(I believe that the 13% CPU mentioned there is a rounded-up 12.5% of the whole CPU, or in other words a full 100% of one of 8 hyperthreads.)
Brian Kernighan's COS333 page links to a goodui.org that recommends animations (in moderation) but also a bunch of … So there's that.
Yesterday I saw someone here on HN saying they didn't know (something about Windows) because they never used Windows.
Performance is not on the list and probably never will be.
>They have raised so much money and have so many engineers.
> My fan comment was a joke about Slack's efficiency.
Hence the "lol" ;)
I'm impressed; with Xcode and the simulator I'm lucky to get anything longer than 6 hours…
I got the figures from https://en.wikipedia.org/wiki/MacBook_Pro#Technical_specific... and https://en.wikipedia.org/wiki/MacBook_Pro#Technical_specific...
Yeah, but try standing in their shoes: they too are probably wondering how you can bear getting worked up over such small things as fans going off.
It admittedly sounds like a worse situation to be in (seeing that it means living one's whole life in constant irritation) than to have noisy fans.
Fans spinning up is an indicator of something. Usually an indicator that the machine is under heavy load and you need to watch out for the things that come with machines under heavy load: slower performance, overheating, graphical artifacts, potential crashing, etc.
If my fans spun up when I was not aware my computer was under heavy load, I would believe something was wrong with my computer. If I spent any amount of time researching why my computer was under heavy load and my conclusion was that it was not under heavy load and that the fans just spun up for no reason, I think I'd be rightfully irritated.
Imagine a doorbell that rings randomly even if no one is at the door. In the grand scheme of things it's minor. But it shouldn't be happening, and every time it does I have to get up and walk to the window to see if someone is actually there. Every time the doorbell rings and no one is there, I'm going to get more and more irritated.
I'm not an expert in CPU thermals by any means, but from what I've gathered, in order to eke out a tenth of a gigahertz for their marketing materials (with rapidly diminishing returns because physics), manufacturers usually set Turbo Boost power limits 5-10 (or more) watts too high. Since Turbo Boost usually maximizes a single core's frequency, and the heat generated rises superlinearly with frequency, it creates a very concentrated heat spike in the silicon. Even if the CPU heat sink is good enough to passively dissipate that much heat from all of the cores, the turbo boost hot spot forces the fans to spin up early, before the CPU knows how long the boost will be needed (otherwise Turbo Boost would significantly reduce the lifetime of the CPU). Combined with randomly scheduled OS tasks that take a split second of turbo, and firmware configured with a minimum fan running time to avoid even more annoying pulsing, these power limits cause many mass-market laptops to needlessly spin up their fans all the time.
It's ridiculous but I've been running ThrottleStop/the equivalent on Linux to under-clock every laptop I've owned since Turbo Boost was introduced. A small 10-20% reduction in turbo boost power limits is rarely noticeable unless you have a very specialized and irregular CPU-bound workload but it makes a significant change in the amount of heat it generates and gives the passive dissipation enough time to absorb a boosted workload instead of spinning up the fans in an emergency.
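For intuition on why a small clock reduction buys a disproportionate thermal win, here's a back-of-the-envelope sketch using the classic dynamic-power model (P ∝ C·V²·f) with the simplifying assumption that voltage scales roughly linearly with frequency in the turbo range; the numbers are illustrative, not measurements:

```python
# Toy dynamic-power model: P = C * V^2 * f.  Assumes (a simplification,
# not measured data) that voltage tracks frequency in the turbo range,
# so dynamic power goes roughly with the cube of the clock.
def relative_power(freq_scale: float) -> float:
    """Dynamic power relative to full turbo, for a given frequency scale."""
    voltage_scale = freq_scale  # assumption: V scales linearly with f
    return voltage_scale ** 2 * freq_scale

saving = 1 - relative_power(0.85)
print(f"15% lower clock -> roughly {saving:.0%} less dynamic power")
```

Under that cube-law assumption, a 15% clock reduction cuts dynamic power by roughly 39%, which is why modest underclocking keeps fans off for so much longer than the performance loss would suggest.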
The general advice is to never ever leave the factory configuration in place (if you're not going to install Linux at least reinstall windows)
Hardware manufacturers are, in general, not your friends.
What if the "no discernible reason" is just poorly coded software?
The stock firmware is, uh, not good. I ported Coreboot to the second batch boards (https://github.com/mjg59/coreboot/tree/X210_good ) and things improved significantly - I wrote it up at https://mjg59.dreamwidth.org/50924.html
Spotify, Slack, and Discord are major culprits, like if someone posts a single gif (are they animated manually by React?). But even some recipe website on a browser tab might go into 100% CPU usage because part of its ad-serving kit was blocked by ublock.
I consider it (or any app like it) essential for maxing your battery lifespan. Really wish it was built into macOS.
What's nice about having a global CPU graph is that you get used to normal idling levels and will start to notice anomalies.
You're also just closer to the pulse of your computer. What exactly are normal bandwidth consumption levels over the course of a day? What speeds are you normally getting? Which apps and which actions seem to be the hardest on your resources? I think these are just nice things to know about the device you use every day like how you might get used to the sounds and feel of the car you drive every day.
For example, I often see HNers suggest that a good computer confers no benefits for web browsing. Meanwhile, I can say that web browsing is the most resource-intensive thing going on in my computer. Watch your CPU graph as you click around the internet. Or when decoding a high-res Youtube video or a muted autoplaying video on some news article. Or scrolling Facebook/Instagram.
I have an older machine that stutters while playing 720p+ Youtube videos unless the CPU is idle, and busier webpages take much longer to render and lock up the UI before I can click around. A better computer can save an impressive amount of time in the long run. It's not for nothing!
macOS always tracks energy usage stats so you don't have to keep Activity Monitor open; just look at the 'Average Energy Impact' column. iStat Menus also uses a fair bit of CPU, similar to Activity Monitor; it's just split into a couple of processes so it doesn't show as much. Not that this is a reason not to use it (I love it and use it daily), but it's not a good contrast point with Activity Monitor.
Good point about iStat Menus consuming its own resources. It has two processes afaict with its main process staying below 1% CPU. I bet that number sees an increase once you start turning on more HUDs like the temp/fan sensors and jacking up the update frequency though.
If you want to know "what is using all my battery" after a couple hours of use, you want to know the average power usage by app, which Activity Monitor gives you. Battery life is the point of this thread, not random lock-ups.
> Good point about iStat Menus consuming its own resources. It has two processes afaict with its main process staying below 1% CPU.
I wish we could just get rid of 'CPU %' as a metric on modern machines, or at least scale it based on P-state. Saying process X is taking Y% of the CPU is pointless if you don't also communicate the P-state, and chances are you don't even know which core the process is running on. Point is, I'm sitting here almost completely idle and there are plenty of tasks 'using 3-5% of my CPU', while of course the CPU is running at 800MHz, nowhere near its all-core peak of 2.7GHz, much less its single-core peak of 4.5GHz. What does 1% CPU even mean in this world?
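One rough way to make the metric less misleading, as a sketch: weight the raw percentage by the current clock relative to peak (the function name and figures are mine, taken from the comment above, for illustration):

```python
# Illustrative only: scale a raw "CPU %" reading by the current frequency
# relative to the single-core peak, turning it into a share of peak capacity.
def effective_utilization(cpu_percent: float, cur_mhz: float, max_mhz: float) -> float:
    """Frequency-weighted CPU utilization, as a percentage of peak."""
    return cpu_percent * cur_mhz / max_mhz

# "5% CPU" reported while the clock sits at 800 MHz, on a chip whose
# single-core peak is 4500 MHz:
print(f"{effective_utilization(5.0, 800, 4500):.2f}% of peak")  # 0.89% of peak
```

So a scary-looking "5% CPU" at the lowest P-state is well under 1% of what the chip can actually do, which is the point: the raw number tells you almost nothing without the frequency.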
I run Activity Monitor 24/7; it's actually not that bad, especially if it's not visible.
I also got the first Macbook 12" and I would get better battery life running Windows 10 than I would on OS X. On Windows 10 I would get ~12 hours. Windows 10 on the Macbook Pro drained the battery quickly though, because you couldn't switch to integrated graphics.
Most MacBooks don't turn their fans on until the aluminum bottom is tenderizing your lap (and/or melting the table). You could configure this on most non-Apple laptops too, but running the fans earlier keeps temps down and probably extends the hardware's lifetime.
If a modern 13.3 screamer with battery life is what you are looking for however, check out the just released Thinkpad x390. It's even more modern and the battery life is a staggering 17-18 hours.
What is hard however, is getting long life out of a performance laptop with a 45-60W CPU when you're capped to a 100Whr battery.
FWIW it runs OpenBSD and is currently at 52C.. When doing a heavy compile or something it can get up to about 80. Mounted to the back of a monitor, can't even see it.
10/10 would recommend.
That's probably a fake or aged battery, or your power management is set up incorrectly. An X62 with a fresh battery lasts a lot longer than 2 hours.
My more recent Macbook Pro at work is basically worse though, not sure why.
As for Windows 10, it's in its own class of terrible design - try as they might, the Linux folks are nowhere close to matching it.
I have to bear with my laptop's loud fan, but I would like to replace it with a fanless one in the future.
Carbon series form factor...3 hours at a time.
At a place I used to work, laptops were routinely getting bricked by the anti-virus software. Basically it would run a scan, the laptop would overheat and die.
At some point, it seems like the "cure" can be more dangerous than what it's designed to fix.
> Not going to lie, that’s pretty horrible.
On the next line he says that with a newish kernel he can get 6 hours with the flush battery. That is not bad at all.
> Update (2017-03-17): I managed to get PC7 idle by upgrading my kernel to 4.18 and replacing the r8168 module with r8169. Battery life has increased significantly. I now get 6 hours with the flush battery and 10 hours with the extended battery.
I know none of this will be very significant, but I wish my laptop could at least last the whole day on battery.
Now when my fans spin up, I just blame McAfee.
I got a Thinkpad E485 ( wanted to try AMD Ryzen mobile) and with Ubuntu installed, I get about 2.5-3 hours on the battery, which is... I forget now. I think it's 48Wh.
You really should be able to cover a full day’s work on 80Wh.
For desktop PCs, the ATX standard means that the entirety of a high-end gaming PC upgrade often consists of just a new motherboard, CPU, RAM and GPU.
A 2007 Lenovo ThinkPad X61 chassis is not that different to a 1997 IBM ThinkPad chassis (or a 1997 Dell Latitude XPi chassis). If the laptop industry standardized, manufacturers would produce a vast ecosystem of compatible components.
Instead we got decades of incompatible laptops using several different power supply voltages (and therefore ten slightly-differently shaped barrel power plugs), many incompatibly shaped removable lithium-ion batteries, and more expense and difficulty in sourcing parts if and when components break.
A little bit of forward thinking in the late 1990s would have saved a lot of eWaste.
Manufacturers probably don’t want to standardize on the remaining motherboard/graphics/chassis/cooling because a laptop isn’t like an atx computer where you get modularity at the expense of wasted space. A laptop is basically a 3D puzzle with thermal components. Few consumers would buy a laptop with even a little wasted volume or weight, even if it meant better serviceability and upgradeability. Same with phones. We aren’t going to see modular phones beyond the concept stage either.
> Same with phones. We aren’t going to see modular phones beyond the concept stage either.
I disagree. I'm writing this on a Fairphone 2, which I bought for its modularity & because running Lineage OS (or any other OS you choose) doesn't void the manufacturer's warranty. While I'm sure Fairphone's sales are small compared to the broader industry, I think they've shown a market exists for ethical, modular phones. I've seen other Fairphones in the wild here in France, as well as seeing them for sale on used goods sites like leboncoin.fr.
But surplus laptops are in such quantity that I’d be fine replacing my tablet with a laptop.
https://www.bsicomputer.com/products/fieldgo-m9-1760 for example (the first vendor I saw that actually shows prices, as opposed to just request-for-quote)
It starts at nearly $2400 for a low-spec Celeron, and I'm not sure it even has an onboard battery.
What I could see as viable would be a micro-ATX case of similar dimensions, sold as a barebones for like $300-- use the extra volume from not accommodating ATX mainboards to store batteries and charging circuitry, which can be off the shelf because space constraints are minimal. Pop in some reasonably priced desktop components, and you'd have a competent luggable for under $1000.
I'm using a Silverstone ML08 (about $100 at the time), which is slightly bigger than a PS4 Pro but fits a full-length, double-slot GPU.
The ML05 is even smaller, costs about $50, but fits only a single-slot GPU.
We don't even have to go that far. Just ensuring that laptops can be serviced by their own users would go a long way to reduce e-waste. i.e. not soldering RAM chips to the motherboard, making it feasible to remove every single part (not gluing the keyboard to the MB for example), etc... instead of pursuing an ever thinner laptop design, which has practically no use.
Here’s an example report that mostly has to do with the production and recycling aspects: https://www.apple.com/environment/pdf/products/notebooks/15-...
When people are done with their MacBooks they don’t just throw them out - they sell them or hand them down to their relatives/kids because they still work well enough, are supported by the manufacturer, are durable and have very high resale value in secondary markets.
Robust engineering, longevity, support, and resale markets do more for the environment than making components user-replaceable.
My old 2011 MacBook Air is still going strong and being used by my mother. If anything goes wrong, she can take it to the Apple store and get help promptly. She still gets software updates, and that thing can STILL be sold for ~$250-300 on eBay, Swappa or Nextdoor. If the machine breaks completely, she can take it into the Apple store to get it properly recycled in almost any part of the world.
That’s what minding the environment looks like. You have to look at the entire lifecycle of the product from the moment the raw materials are sourced all the way to making it easy to recycle when a product is end-of-life.
Two: it's much, much harder to support a repair done on-site with a soldering iron than it is to replace a part. These repairs are much more likely to fail under both normal and unconventional use and then will come back for more repairs--which are themselves, still, expensive to provide.
Three: waste concerns have to factor in what Apple does with the part after they do the swap. (I have no insight into what they do, but your comment ignores this.)
Saying he is my favorite Youtuber is a bit condescending. I mentioned him, because he is a loud proponent of the right to repair movement.
>> These repairs are much more likely to fail under both normal and unconventional use and then will come back for more repairs--which are themselves, still, expensive to provide.<<
If that was true, I am sure Apple would choose to repair parts instead of replacing them. ;)
Of course it's true--everything from "that fan's just going to get dirty again, and faster, because it's been blown out but can't be re-sealed outside a factory" to "that solder joint is being done by somebody making fourteen bucks an hour, boy I hope I'm not relying on that long-term".
Why would a company that makes its money off of selling the closest thing to a unified end-to-end experience take the risk of a dissatisfied customer because of a frustrating defect remediation experience?
The quoted point is an example of a fundamental misunderstanding of how Apple views its customers and how Apple makes its money. But stuff like that is a closely-held truth in the various repair-uber-alles communities on the web regardless of reality. (And then, as 'Operyl notes, your cited YouTubist attempts to shore up his own little slice of community by instilling in them the "enlightened"/"sheep" dynamic. Petty little cult leader, that.)
Sorry that you read some real distaste for that mess as condescension, but not sorry to voice that distaste.
You make it sound like Apple has never done it before.
Case in point: the overheating early 2011 Macbook Pros - a problem experienced by thousands of customers.
Apple basically pretended the problem didn't exist for well over a year (there was a gigantic thread about the issue in the Apple support forums). By the time they did issue their recall (or "repair order", if you want to use Apple's euphemism), a lot of people had already divested their dead Macbook Pros for a loss.
Mine had bricked just after my AppleCare expired, and I wasn't about to spend $500+ to get a replacement logic board (which basically had the same defect, except it was a brand new board. Source: I had replaced my logic board under AppleCare only to have the problem recur within two months). I was lucky that I didn't dispose of my Macbook Pro before the repair order, but I had bought a replacement laptop by the time it was issued (spoiler alert: it was my first non-Apple laptop purchase in a decade).
They also put up barriers to getting the repair order. You had to prove you had the heat issue and that it was causing crashes. Since mine was bricked, it was easy. But a friend of mine (who had two of the affected models) had to jump through hoops at the Apple Store to get his fixed.
Those early 2011 Macbook Pros were mostly high end 15" i7 models, meaning they were not on the lower end of Apple's Macbook Pro line. People paid good money for them. If Apple didn't have their heads in the sand and gave everyone replacements (i.e., a 2012 model, which didn't have heat issues) as the problem occurred, it would have been a rounding error for them. But they didn't do that.
>> fundamental misunderstanding of how Apple views its customers and how Apple makes its money.
Speaking from my one experience - I didn't feel like Apple was interested in my experience at all. While I never considered myself a fanboy, I was very loyal to Apple and totally invested in the ecosystem. After my experience with the 2011 Macbook debacle, I abandoned them completely. It meant writing off a lot of money spent on Mac software, mobile apps, etc.
But if the parts were also user replaceable it would be better for the environment.
Not to mention you don’t want the typical user (forget the HN audience) to replace the components themselves.
Most professional users are on corporate enterprise device plans and you don’t want employees or the IT department replacing components either. It’s far better and cheaper to get the employee back up and running with a new machine while the one in need of repair gets shipped off under enterprise warranty.
Coincidentally, this is also a machine where you can still swap the SSD. With the help of some Alibaba engineering, you can even use a stock M.2 SSD.
The latest Macbooks are bullshit, you cannot exchange anything.
As a nice bonus it sometimes saves money. I’ve only picked my netbook for CPU (i3-6157U, 64MB L4 cache), GPU (Iris 550, 0.8 TFlops) and display (13.3” FullHD IPS). Upgraded to adequate amount of RAM (16GB) and larger and faster M.2 SSD. Both were too low out of the box, and even today there’re not many small laptops with 16GB RAM.
> Soldered CPUs are unfortunately inevitable on modern ones...
To be fair, even on desktop, replacing a CPU on the same motherboard is a pretty niche thing in my experience. Not to say people don't do this, but most of the people I know upgrade both at the same time, either because of incompatibility or because of substantial gains with the newer MB. So soldering the two together is not as bad as gluing the keyboard to the motherboard in my eyes.
The desktop I’m using now had i5-4460 at the time of purchase, eventually upgraded to Xeon E3-1230v3. Only going to upgrade motherboard after AMD releases Zen 2 desktop CPUs.
A family member uses a laptop that initially had i3-3110M. I’ve put i7-3612QM there, it’s comparable to modern ones performance-wise despite 6 years difference, e.g. cpubenchmark.net rates i7-3612QM at 6820 points, i5-8265U at 8212 points (because 35W versus 15).
I agree about glued keyboards. Keyboards are exposed to the outside world and also subject to mechanical wear. The only thing worse than that is soldered SSDs. They make data recovery very hard, and the rate of innovation is still fast for them: SSDs that become available a couple of years from now will be both much faster and much larger, so upgrading them regularly makes sense for UX.
So yes, laptops should absolutely be made easier to modify, the components get old really fast and I don't wanna buy the whole thing each time I want an extra bit of RAM or some small part gets broken. It's one of the things that make me steer way clear of Apple stuff.
yes, but very few, and getting fewer and fewer as we speak. Even Lenovo, which was famous for that, has ended up soldering RAM in their recent models and making the battery a hassle to replace, while it used to be external before.
For an example of what I'm talking about, see this for a current-gen Intel-based 13.3": http://h10032.www1.hp.com/ctg/Manual/c05695299.pdf
Update: and if you're going to install Linux, these laptops can always be bought without a Windows license. Corporate customers use volume licensing; they don't need these OEM Windows keys and aren't willing to pay for them either.
Do you have any number to compare the size of the consumer market vs the Enterprise market?
Also, companies aren't always superrational logic machines that have coldly calculated their every move; there's sometimes a lot of collective delusion going on that can leave their consumers in the cold, who then just make do with the best out of a bad lot that's offered. Recall the recent iPhones: suddenly every other phone had a notch even when it served no purpose; or the removal of audio jacks, for example. There was NO consumer preference expressed there, just one company that decided it that way for its own purposes, and others blindly copying it.
If companies spent money telling consumers to value upgradability and not to buy new stuff all the time, then we'd value that more .. but that doesn't sell more stuff, it just helps save the planet, so why bother ....
Capitalism is such a blessing.
No, but I would be shocked if they have never run focus groups for this type of stuff.
So, unless I'm mistaken, cooling the RAM and moving it to another machine doesn't work any more.
Source: I'm working on a DDR4 layout right now, and the memory controller scrambling and swapping functions are documented in publically-available intel datasheets (for example, see https://www.intel.com/content/dam/www/public/us/en/documents... sections 2.1.6 and 2.1.8)
It's likely companies like Microsoft (pre-Surface), peripheral manufacturers (eg, Logitech) and motherboard manufacturers (eg, Gigabyte) would have gladly got on board in that era.
It's likely too late to start this in 2019 (but I may be wrong). Certainly the late-1990s would have been the ideal time for this.
In a way, this is much like the situation with desktops-- Dell and HP were/are big enough to come up with their own custom mainboards and cases, but most smaller shops are going to go ATX.
I suspect part of the reason we didn't see much laptop standardization was that the second-tier manufacturers are weaker in the laptop sector than desktop, as well as being weaker as a whole than they were in 2000 when ATX was becoming a thing.
Outside of a few narrow gaming and workstation niches, there are few use cases where you can't find a suitable big-brand laptop, so the second-tier brands (and the manufacturers that supply them) are in a position of fighting for scraps, not one where they can start promoting the benefits of standardization.
This is likely worsened by the mindset that laptops are unupgradeable-- people bought ATX desktops figuring they'd buy new mainboards in 3 years, but generally assume the laptop is going to be stuck in place.
The reason it hasn't happened in laptops is that you would have to compromise on size and form factor.
If standards lower margins and make entering easier, that’s what should be regulated for.
A ref i really want to push: "Mutual Aid: A Factor of Evolution" (1902, Kropotkine). There is a whole part mostly about the evolutionary analysis of altruism (the rest is about analyzing several human social orders throughout history: pre-medieval villages, medieval cities and 19c industrial cities).
For very low-power machines you might have tons of internal space free, but more powerful laptops need complex heat management in addition to reducing size and weight. It's only now that we have very advanced fabrication techniques and energy-saving designs that we no longer have to hyper-focus on heat exchange.
If size and heat and weight weren't a factor, you can bet that a standard would have arisen to manage interchanging parts. But soldered RAM is a good example of why that's just not necessary, and can be counter-productive for reducing cost and maximizing form factor.
Even if you don't want to keep your widescreen DVI display from 2008, the interoperability means that when you drop it off at a e-waste center, it's more likely to be reused in its current state for a few more years, rather than immediately recycled (reduce, reuse, recycle!)
I do agree there is some degree of change in interfaces over time (like IDE to SATA to NVMe M.2), but if you build a system for a similar intended use case, the changes within any given 5-year period are small. This means the upgrades you do over a 15-year period will go from a 2.5" platter drive to a 2.5" SATA drive, or from an 800x600 to a 1024x768 display, but not both at the same time (with a different but significant set of components being shared every upgrade).
The standard ATX form factor has been upgraded to reduce size over various years with the vast majority of accepted iterations maintaining the same mounting hole and IO panel locations. I literally have a mini-ITX board sitting in a case I purchased in 1999. This probably fits more into the fear you state in your comment with a reasonably new technology "forced" to consume more space than is necessary, but I think it argues for the opposite by showing that incremental changes to a standard format can allow for wide ranging compatibilities.
For example, when ATX was altered by squaring the board to the shortest length (microATX), it didn't require a new case or a new power supply to be placed on the market in order to be adopted, because it fit within the "too big" ATX case. Then, when cases that only fit microATX became abundant and another incremental change shrank the motherboard size to DTX, we again didn't have to release new cases, power supplies, or IO cards to start using this version. It allowed consumers to purchase and use the boards until they decided they wanted to reduce their case size, amortizing the upgrade costs over months instead of requiring larger up-front payments.
And that's great, if you're into generic beige boxes.
It's been years since I put together my own IBM compatible computers. But in the time since then, I haven't really seen any innovation in desktops.
Yes, for a while the processor numbers ticked up, but then plateaued. Graphics cards push the limits, but that has zero to do with the ATX standard, and more to do with using GPUs for non-graphics computation.
The laptop and mobile sectors seem to be what is driving SSD adoption, high DPI displays, power-conscious design, advanced cooling, smaller components, improved imaging input, reliable fingerprint reading, face recognition for security, smaller interchangeable ports, the move from spinning media to solid state or streaming, and probably other things that I can't remember off the top of my head.
Even if you think Apple's touchbar was a disaster, it's the kind of risk that wouldn't be taken in the Wintel desktop industry.
All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...? I'm not sure. Even liquid cooling was in laptops in the early part of this century.
Again, I haven't built a desktop in a long time, so if I'm off base I'd like to hear a list of desktop innovations enabled by the ATX standard. But my observation is that ATX is a pickup truck, and laptops are a Tesla.
Desktop is still the primary place for innovation. Laptops use technology that was introduced and pioneered on desktop, then refined until it could fit in Mobile/Laptop. Don't get me wrong, there's probably more work in getting the tech into Mobile than developing it in the first place... But the genesis of the ideas happen on desktop.
Desktop has the opposite mix of freedom and constraints as mobile. Standard internals, but freedom of space. There are dozens of heat-sink manufacturers for PC... Dozens of small teams focused on one problem. There's some variation between chipsets, but nothing that requires major design changes. These teams can afford to innovate... And customers can afford to try new solutions. If the heat-sink doesn't perform, you're out 5% of the total cost. But there's no similar way to try things out for laptops.
For example... Should a laptop combine all of its thermal dissipation into one single connected system or have isolated heat management? It completely depends on usage and thermal sensitivities of the components... It was desktop water-cooling that gave engineers the ability to test cooling GPU and CPU with the same thermal system to determine where to draw the line.
>All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...?
Have you ever built an ATX computer? I assure you, there are plenty of different standard form factor cases out there. The beige box thing was in vogue in the 90s, but today the big trends are sleek black with tempered glass.
And standard form factor desktop does not equal giant tower. You could also do a mini ITX build, a standard that's been around since 2001 for what it's worth.
High DPI displays? This implies high-end displays weren't available to desktops first (they were). A decent CRT could produce much higher DPI than LCDs could (in that era). Part of the reason Windows DPI independence sucks is that Microsoft implemented it super early, without all of the insights Apple had to do it right, and now there are like 4 different DPI mechanisms in Windows.
All in all I'm not sure what really needs "innovating" so badly with desktop form factor. Do we need to solder our RAM to the main board, is that "innovation?"
You kind of say it yourself:
>that has zero to do with the ATX standard,
So would be the case for any form factor standard. It only dictates how things interoperate.
Improved efficiency and the demise of bulky storage devices has created a proliferation of small-form-factor designs. We have two proper standards in widespread use (mini-ITX and mini-STX) and an array of proprietary designs from Intel, Zotac and others. It's now possible to get a fast gaming machine the size of a Mac Mini, or a monstrously powerful workstation that'll fit in a shoulder bag.
Maybe it could evolve into a laptop-like experience if the blocks get powerful enough and somebody develops a compatible chassis.
*update: Project Ara was cancelled in 2016.
Maybe laptops are now mature enough as a product that what you suggest could be feasible, but it is too late now for business reasons.
I considered the Lenovo Carbon X1, but it is pricey, doesn't have a number pad, and is at the ultra-slim form factor of a MBP or other similar notebook form factor.
The Lenovo T580 has the num pad, but the graphics card is the NVIDIA MX150, a mobile but faster version of the GeForce 1030. Not really an issue for me, but my son's Lenovo Yoga came with a 1050 two years ago.
Anyway, I've owned all sorts of notebooks, including MBPs, and have found the Lenovos to be my workhorses; they get out of my way and let me get things done. Yes, the battery is only 4 to 6 hours, but for me, even traveling and living all over the world, it has never bitten me work-wise, only when playing.
the P71 is very bulky (somehow more than it looks in the pictures) and its case feels a bit cheap (the case is "real" plastic, and if you tap on it below the keyboard it really makes the sound of an empty plastic case) => personally I expected a bit more, as it's not cheap (the components I chose are basically the cheapest ones, with the exception of the 4k panel and the backlit keyboard).
Additionally, some weeks ago, while I was typing, the backspace and "t" keys stopped working out of the blue => I switched the laptop off, but 1 day later the keys were still not working => I then opened the laptop and extracted the keyboard (veeery easy - compliments, Lenovo) to read the model number so I could order a spare part later, touched the keyboard's connector cable a bit, and after putting it all back together the keys magically started working again.
Saying all this just because I have the feeling that, overall, the P71's build quality might be a bit lower than that of other models. Cheers.
I use FreeCAD, since I have been familiar with it for years. It was once clunkier and had fewer features, but now it can be used for a lot of the things I need to do.
For a quick start in making a cube, make sure you are in the correct workbench (Part or Part Design) for the tutorial you're following. The main issue I have is that it is difficult to interface with clients that use Autodesk, Solidworks, or Rhino products. I have done FEA with the CalculiX backend in FreeCAD, and used ParaView to create my FEA pictures for reports. If you know Python, you can even create a cube in the console below. Check out the scripting tutorials. I am not a fan of Python, but I use it in FreeCAD and Blender3D.
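To give a feel for what the console scripting looks like, here's a minimal sketch of making a parametric cube. This is an illustrative example, not from the comment above: it assumes it runs inside FreeCAD's built-in Python console (the FreeCAD module is not importable from a plain Python install), and the document and object names are arbitrary.

```python
# Minimal sketch: create a 10 mm cube from FreeCAD's Python console.
# Assumes it is run inside FreeCAD, where the FreeCAD module is available.
import FreeCAD as App

doc = App.newDocument("CubeDemo")            # a fresh document (name is arbitrary)
box = doc.addObject("Part::Box", "MyCube")   # parametric box from the Part workbench
box.Length = 10  # dimensions in mm
box.Width = 10
box.Height = 10
doc.recompute()                              # rebuild so the cube shows in the 3D view
```

From there you can tweak the Length/Width/Height properties in the console or the property editor and recompute again, which is the appeal of scripting it rather than clicking through the GUI.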
Do you use a non-root account?
For normal dongles, a search for "Anadol Gold Line Wifi AWL150 Micro 150Mbit/s USB WLAN S" shows the one I use in various BSDs/Linuxes with no problems.
Both re0/run0 chips.
It also depends on whether you are in managed mode or monitor mode. See this: https://github.com/nmap/npcap/releases
Here's some more info on modes and your capture setup: https://wiki.wireshark.org/CaptureSetup/WLAN/CaptureSetup/WL...
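For the Linux side, switching between the two modes is usually done with `iw`. A hedged sketch, not from the comment above: it assumes a Linux box with the `iw` and `ip` tools, and "wlan0" is a placeholder interface name (check `iw dev` for yours; some drivers don't support monitor mode at all).

```shell
# Sketch: put an assumed interface "wlan0" into monitor mode for capture.
sudo ip link set wlan0 down
sudo iw dev wlan0 set type monitor
sudo ip link set wlan0 up

# ...capture with tcpdump or Wireshark here...

# Switch back to managed mode when done:
sudo ip link set wlan0 down
sudo iw dev wlan0 set type managed
sudo ip link set wlan0 up
```

In managed mode you only see traffic addressed to your own association; monitor mode is what lets Wireshark see raw 802.11 frames from other stations.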