Thinkpad X210 (greer.fm)
1005 points by MYEUHD on March 17, 2019 | 477 comments



> Battery life is a little over 4 hours with the flush battery (55Wh) and 6-7 hours with the extended battery (80Wh).

Not going to lie, that’s pretty horrible.

> Battery life would increase by 50% if I got PC6 or PC8 idle states. The fan only turns on if I’m doing something intensive like compiling go or scrolling in Slack.

lol. One of the things that really drives me nuts is my computer’s fan turning on when I know it really shouldn’t be. I have lived and worked with people for whom having their fan randomly turn on for no reason is completely normal, and I just can’t understand how they can bear it. If this happens to me, you can bet I’m digging through Activity Monitor and killing the culprit before the fans can get fully ramped up.


I don't lie about battery life numbers. People like to say "I get 9 hours of battery life" when they mean that they get 9 hours if they do nothing but let their computer idle with the brightness at minimum. 4 hours with a 55Wh battery is an average consumption of roughly 13 watts. That's because my typical workflow involves running a VM containing cassandra & postgres (among other services) and recompiling go and javascript. My coworkers with 15" MacBook Pros tend to worry more about battery life than I do.

My fan comment was a joke about Slack's efficiency. Of course compiling a bunch of go code will make the fan turn on. That will use up 100% of your cores on any decent-sized project.


I get about 9 hours of battery life on my T450s and about 6 on my x220 under load: youtube, vms, coding.

At idle, browsing something like HN or sitting in #emacs, I get near 20 hours on my t450s and about 13 on my x220.

I think very low idle power consumption is the key to long, infrequent charges, because the second you are not using your computer it should also not be using power. For example, my t450s idles at 2.8W without even turning down the screen brightness, so with a 97Wh battery that is tons of idle time (97Wh / 2.8W is almost 35 hours in theory).

You can't really optimize power for when you are actively using the machine. If you are compiling, you want the fastest compile time, which means using the most power. Same for other tasks that take power.

I go all weekend without charging my personal laptop, and the only time I close the lid is when I am going to bed. Otherwise I leave the lid open at the same brightness, walk away for hours, and come back and continue where I left off.

So I see your point about people saying "I get 9 hours of battery life" while doing nothing, but even 9 hours of doing nothing is fairly low, given I can make a 10 year old laptop get close to 18 hours, and a 4 year old one get near 21 hours "doing nothing".


I've just unplugged my x220 and it is telling me it has 4:36 of battery remaining. I only have firefox open and am not doing anything else, so that 4:36 is probably quite an optimistic estimate. It will drop as soon as I do something... Yup, just fired up emacs and now it is saying 3:44 remaining.

Would you mind sharing what setup you have that lets you get such long battery life?


Currently in charge of the baby.

But get powertop installed. Go to the tunables screen in powertop (the 4th one on mine) and make sure all settings are set to "Good". I made a systemd unit to do this for me on boot. I also installed tlp and let it apply its settings. Some of those overlap with what powertop does.

I set some things on the i915 driver but that will have to wait until baby watch is over.
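The unit itself is tiny. From memory it is something like this (powertop may live in /usr/bin instead of /usr/sbin depending on the distro):

    sudo tee /etc/systemd/system/powertop.service <<'EOF'
    [Unit]
    Description=Apply powertop tunables at boot

    [Service]
    Type=oneshot
    ExecStart=/usr/sbin/powertop --auto-tune

    [Install]
    WantedBy=multi-user.target
    EOF
    sudo systemctl enable powertop.service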


I have just installed powertop and updated the settings. It doesn't seem to have made any difference... but that's probably because I don't know what I am doing. I'll read up on it further. Thanks for the tip.


powertop --auto-tune

Does it for me.


What battery do you have? The thin one?


Not 100% sure, I'll have to look when I get home this evening. I think it may not be the thin one as it is thicker than the rest of the case, but I don't know offhand what I should be looking at to determine it.


I don’t think it is possible to get the consumption of a T450s on an X220.

The T450s generation (and the generation before it) uses ULV (ultra low voltage) processors.

The processors used in the T450s generation have a TDP of 15W.

The processors used in the X220 generation have a TDP of 35W.

On the flip side, the processors in the X220 generation (and in the next generation) are more powerful, which is a good thing if you need the performance and have the laptop plugged in most of the time.


The x220 gets about 80% of the battery life of my T450s, and it also seems to compile emacs a few seconds faster... Later today I will try to post some screenshots of power consumption. The x220 can definitely last an entire day on battery.


At my previous job I had a T460p with the full-sized double battery setup, and I routinely got 12 hours actual use with room to spare, and this was on a more or less stock Fedora install without even bothering to do any real power consumption optimizations.

Modern T-series still have amazing damn battery life, and I’ve never found anyone else who comes close except Apple, or Android ARM hybrids like the old ASUS Transformer Prime with the double batteries.


I have horrible battery life on my T520, mainly because the battery is really old.

I wish to purchase a new battery for it and am looking for a place to buy a battery that

1. will ship to Norway without too high shipping cost, and

2. isn’t going to explode in my face.

Any suggestions?

Also, mbrumlow, what distro are you running? And did you customize it in any way, aside from using powertop as you mentioned, to get it to consume low amounts of power while idle?


Track down an OEM Lenovo battery from whatever source you can find. Unfortunately, the aftermarket batteries I've tried all died rapidly, due to cells dying or going out of balance or whatever. Never had an aftermarket last longer than 6 months before it was down to a half hour or something pitiful. Finally sucked it up and bought an OEM battery and two years later it's still giving a couple hours of runtime.


I actually just had the motherboard on my T450s go down. I was regularly getting 10 - 20 hours of development time with the 97WH battery (usually emacs, postgresql, rails and music).

You should be able to optimize and get even higher. I'm actually considering getting a new T480 (since it still has the expandable battery slot), but I'm struggling to decide.


What OS are you running? I'm really curious about your setup if you're running Linux. I have the same Thinkpad but it's impossible for me to get idle wattage like that.



That’s what I’m using, not mbrumlow. Also I just upgraded to 18.10 to get a newer kernel and a newer version of the r8169 module. That unlocked PC7 idle states and increased my battery life by 50%.


Long lasting battery is the feature I appreciate most from a laptop. My T430 with the extended slice battery gives me 7-10 hours, and could probably get another 3+ if I swapped out the CD drive for the Power Bridge battery.

Long live the ThinkPad.

Edit: This is with mostly casual use on an Arch Linux setup.

I think the next killer feature I want/look for in a laptop is the ability to see the screen in bright sunlight. It's a gorgeous day today and I wouldn't mind spending a few hours by the waterfront coding.


> Long live the ThinkPad.

This year's models are doing away with replaceable batteries for the most part. The T490 has a built-in 50Wh battery and no ability to extend or replace it without removing screws. It also gets rid of the 2.5in drive option and one of the RAM slots (but adds a soldered one), and it replaces the full-size SD slot with a micro-SD. All of that for a small reduction in weight and thickness.

So, basically, the thinkpad line is gone now, it's just another poor macbook pro knockoff. I expect to see articles like this one for years to come, given that there are almost no options left if you want power, expandability and battery life in a single laptop package.


Well if it was a MacBook, everything would be glued-in or soldered, but yes the trend is worrying. I guess the popularity of MacBooks, despite their form-over-function, is what is driving this. But surely they could have kept the T490 like the T480, especially as the T490s exists.


> I wouldn't mind spending a few hours by the waterfront coding.

I do this with an OLPC XO. I also have a Cr-48 with Lubuntu. There's a tradeoff: the XO has a brighter screen but a smaller keyboard.


Is a brighter screen what we want though? E-ink type screens are excellent in the sun, but lack gorgeous colors... And I'm almost ok with it.


There’s maybe a billion-dollar opportunity there: someone could invent a hybrid screen that switches from (O)LED to e-ink mode under bright lighting conditions.


That's exactly what OLPC has. And I remember an old Nokia phone having a screen that was both at the same time, so that bright sunlight would just wash out the colors but not the content: https://en.m.wikipedia.org/wiki/Transflective_liquid-crystal...


Somebody did invent a hybrid screen, Pixel Qi, it was used in the Notion Ink Adam tablet from 2010.

But it wasn't a billion-dollar opportunity, and Pixel Qi are so forgotten they were reported by the media as closed down, although they might technically still exist (and might have released their IP to the public, I haven't read too far on it just now).


The OLPC XO with the sunlight readable screen by Pixel Qi is a wonderful machine to use in bright daylight. I love mine.


T430 doesn't support the Ultrabay battery


At least my 2013 MacBook Pro would get 10 hours of battery life with the screen at medium brightness, WiFi on, and using Safari and Word (which is not a particularly light workload given how heavy the web is these days). Working in Emacs/SBCL would result in even longer battery life. Unless you’re rebuilding Chrome from source every few minutes, the compile-edit cycle is mostly typing into your editor with short bursts of activity in-between. Even more so for languages with lightweight compilers like Go and JS.


> which is not a particularly light workload given how heavy the web is these days

I think a lot of this might be attributable to just how well power-optimised safari is. I could get 10 hours using safari when my macbook was new, but only 6 using chrome or firefox! Given that safari is speed-competitive with these browsers, it's very impressive.


What kind of numbers do you see in powertop between high and low load? (After running powertop --auto-tune and powertop --calibrate of course).

I've noticed the i7-8550U in my HP Envy can easily drive consumption up to 22W during a parallel compile even though I spend most of my time idling between 5 and 7W while editing text. The extra cores seem to consume a lot of power compared to my older 5th-gen laptop with 2 physical cores.

I also wonder if mobile Ryzen behaves similarly or if Intel just trashed their power/thermals in order to compete on core count.

Unfortunately, it's really hard to find benchmarks with power vs load stats. Most of them seem to be like "7 hours watching a movie in Windows" when I'd rather see "Battery is 55 Wh but computer uses 22W when all cores are 100% and 35W at 100% CPU and GPU". You know, like actual facts about the hardware rather than subjective usage experiences.


Powertop doesn't show any wattage numbers for me. I looked at gnome-power-statistics and saw that my laptop idles around 9 watts. That's all power consumption including the screen.
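For anyone curious, the raw draw is also available in sysfs while on battery, at least on machines that expose power_now (others report current_now/voltage_now instead):

    cat /sys/class/power_supply/BAT0/power_now   # instantaneous draw in microwatts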

This afternoon I upgraded my kernel from 4.15 to 4.18. Then I removed the r8168 module that I'd tried to use to reduce power consumption. Using the open source r8169 module got me PC7 states. Now my X210 idles at 5 watts. I'm guessing I could go lower if I tweaked more, but this is a huge improvement in battery life. I'm glad I took the time to look into it.
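If anyone wants to try the same module swap, it was roughly this (Debian/Ubuntu-style commands; the blacklist file can have any name under /etc/modprobe.d):

    # see which Realtek driver is currently bound
    lsmod | grep r816
    # blacklist the out-of-tree r8168 so the in-kernel r8169 binds instead
    echo "blacklist r8168" | sudo tee /etc/modprobe.d/blacklist-r8168.conf
    sudo update-initramfs -u    # Debian/Ubuntu; then reboot
    # afterwards, check for PC7 residency in powertop's "Idle stats" tab
    sudo powertop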

I've gotten battery consumption figures as high as 30 watts when running something like `./minerd --benchmark --threads=8`. Intel claims the i7-8550U has a 15W TDP, but it can easily go above that if it has enough cooling.


> My fan comment was a joke about Slack's efficiency.

It's less a joke and more a plain statement of fact.

Seriously: even something as small as someone adding an animated emoji makes slack eat CPU :-/


I used to wonder why my ThinkPad sometimes ran for 12 hours and sometimes 5. Turns out that Google Docs has an animated cursor that doesn't just blink, it fades in and out using javascript. The animation cost several watts on average. UX guys love animations because, I dunno, they are just psychos.


Reminds me of this: https://news.ycombinator.com/item?id=13940014

(I believe that the 13% CPU mentioned there is a rounded-up 12.5% of the whole CPU, or in other words a full 100% of one of 8 hyperthreads.)


Remarkably similar, yes!


I've always appreciated Windows's animations for minimise and restore. I hate animations that don't do anything for you, though.

Brian Kernighan's COS333 page links to goodui.org, which recommends animations (in moderation) but also a bunch of horrible stuff.

So there's that.


My office decided to move from Jabber (Pidgin / Adium) to Slack. Much of the office upgraded computers. Whereas Jabber's usage was negligible, Slack used some two to three GB of RAM. It's simply the most inefficient software I have to run continuously on that PC.


They have raised so much money and have so many engineers. Is performance of their app so low priority that no one is working on that?


They have too many engineers to make a small efficient program.


Comment made my day :-)


Their priority was probably shipping a cross-platform app as quickly as possible and gaining adoption. My hunch is they will eventually make native apps for Windows and Mac and drop Linux support.


Dropping Linux support should be getting harder and harder: I've never seen more Linux machines in the office than now.

Yesterday I saw someone here on HN saying they didn't know (something about Windows) because they never used Windows.


Yes. It was first about gaining market share and now it's about integrations to solidify their position against competitors.

Performance is not on the list and probably never will be.


You've answered it yourself:

>They have raised so much money and have so many engineers.


/collapse


FWIW, I "honestly" get about 7-8 hours out of my MacBook Pro when doing light coding (small C/C++ project builds every five minutes, tweaking my website, writing some shell scripts), and 4-5 when doing "heavy" work such as iOS development. I can eke out 10 or more hours if I only read Mail and Hacker News :P

> My fan comment was a joke about Slack's efficiency.

Hence the "lol" ;)


Having a MacBook Pro Late 2016, I can only dream of 4h battery life with the workload you describe. It has a bigger screen though.


Try ditching chrome/ff for safari for most of your workload, I find that consistently gives me a few extra hours of battery, and I always have IntelliJ and VScode open and in use.


Yup, switching from Chrome or Firefox to Safari is the number one thing I recommend to people to improve their battery life on Macs. The less Electron they can have as well, the better, but it's slightly less of an issue because it's not running JavaScript trying to sell you ads.


My work laptop is a mbp 2018 (i9, 32GB). The battery life is abysmal. I typically get a little bit over 2 hours. I'm not going to lie though: I'm always running Atom or VSCode and there are lots of open tabs in Chrome.


Chrome used to halve my battery life on a MacBook Pro. On a brand new machine with a full battery I could program all day in Xcode if I forgot my charger, until one day it was empty halfway through the day. Finally I figured out it was Chrome that killed it so fast.


> On a brand new machine with a full battery I could program all day in Xcode

I'm impressed; with Xcode and the simulator I'm lucky to get anything longer than 6 hours…


Objective-C really is a lot kinder to your battery, I guess. Also it was brand new at the time, and I guess I was using a phone for debugging rather than the simulator.


Did they reduce the battery size on newer models? My 2015 i7 mbp can get about 3 hours of Civ 6 (in strategic view), or 6 hours of regular dev workload. I remember it lasting longer a few years ago too.


Yes. The previous generation 15" MacBook Pro has a 99.5Wh battery. The 2016 & 2017 15" touchbar MacBook Pro has a 76Wh battery. The 2018 model was upgraded to 83.6Wh. Depending on which version you have, you get roughly 16% or 24% less battery than the older model.

I got the figures from https://en.wikipedia.org/wiki/MacBook_Pro#Technical_specific... and https://en.wikipedia.org/wiki/MacBook_Pro#Technical_specific...


Yes. Can't remember how much it shrank in Wh though. I had a 2016 which I ultimately sold after 6 months, then bought a 2015 off eBay. I get roughly 25-33% better battery life on the 2015 than I did with the newer model.


I got the joke about Slack. Loved it.


If you throttle the machine instead of running a fan, battery life under load only drops by less than half (about 25% in my experience). So machines with bad cooling, which I suspect the sort of person who exaggerates battery life is more likely to own, will actually hold up comparatively better when performing more intense tasks.


I was wondering if the comment about slack was a jab. Nicely done!


>One of the things that really drives me nuts is my computer’s fan turning on when I know it really shouldn’t be. I have lived and worked with people for whom having their fan randomly turn on for no reason is completely normal, and I just can’t understand how they can bear it.

Yeah, but try standing in their shoes: they too are probably wondering how you can bear getting worked up over such small things as fans spinning up.

It admittedly sounds like a worse situation to be in (seeing that it means living one's whole life in constant irritation) than to have noisy fans.


>getting worked up over such small things as fans spinning up

Fans spinning up is an indicator of something. Usually an indicator that the machine is under heavy load and you need to watch out for the things that come with machines under heavy load: slower performance, overheating, graphical artifacts, potential crashing, etc.

If my fans spun up when I was not aware my computer was under heavy load, I would believe something was wrong with my computer. If I spent any amount of time researching why my computer was under heavy load and my conclusion was that it was not under heavy load and that the fans just spun up for no reason, I think I'd be rightfully irritated.

Imagine a doorbell that rings randomly even if no one is at the door. In the grand scheme of things it's minor. But it shouldn't be happening, and every time it does I have to get up and walk to the window to see if someone is actually there. Every time the doorbell rings and no one is there, I'm going to get more and more irritated.


It surely is an indicator of something but if you have a modern laptop with factory configuration from one of the generic Windows laptop manufacturers, chances are it's an indicator that the power settings are set up for marketing/benchmarks and not real world use. Every single ultrabook and workstation laptop I've had outside of Apple products have been poorly configured, especially the Intel Turbo Boost settings.

I'm not an expert in CPU thermals by any means, but from what I've gathered, in order to eke out a tenth of a gigahertz for their marketing materials (with rapidly diminishing returns, because physics), manufacturers usually set Turbo Boost power limits 5-10 (or more) watts too high. Since Turbo Boost usually maximizes a single core's frequency, and the heat generated increases superlinearly with frequency and voltage, it creates a very concentrated heat spike in the silicon. Even if the CPU heat sink is good enough to passively dissipate that much heat from all of the cores, the turbo boost hot spot forces the fans to spin up early, before the CPU knows how long the boost will be needed (otherwise Turbo Boost would significantly reduce the lifetime of the CPU). Combined with randomly scheduled OS tasks that take a split second of turbo to run, and firmware configured with a minimum fan running time to avoid even more annoying pulsing, these power limits cause many mass market laptops to needlessly spin up their fans all the time.

It's ridiculous, but I've been running ThrottleStop (or the Linux equivalent) to underclock every laptop I've owned since Turbo Boost was introduced. A small 10-20% reduction in turbo boost power limits is rarely noticeable unless you have a very specialized and irregular CPU-bound workload, but it makes a significant difference in the amount of heat generated and gives the passive dissipation enough time to absorb a boosted workload instead of spinning up the fans in an emergency.
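On Linux, the "equivalent" can be as simple as the RAPL powercap interface in sysfs, no extra tools needed. Roughly like this (the intel-rapl:0 domain and the 12W figure are examples for a 15W part; check constraint_0_name first, and note the setting resets on reboot):

    # current long-term (PL1) package power limit, in microwatts
    cat /sys/class/powercap/intel-rapl/intel-rapl:0/constraint_0_power_limit_uw
    # drop it to 12W (example value; tune for your own machine)
    echo 12000000 | sudo tee /sys/class/powercap/intel-rapl/intel-rapl:0/constraint_0_power_limit_uw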


> if you have a modern laptop with factory configuration from one of the generic Windows laptop manufacturers

The general advice is to never ever leave the factory configuration in place (if you're not going to install Linux at least reinstall windows)

Hardware manufacturers are, in general, not your friends.


Turbo Boost power limits have to be changed using the Intel Extreme Tuning Utility or a similar piece of software that sets the correct registers. Operating systems usually leave those lower level details to the BIOS, so installing a new operating system won't do anything unless the system default is set correctly and some bloatware driver is updating it at boot. Usually the manufacturer does it in the BIOS, so you have to overwrite it every time you boot; or they do it at the factory, in which case just setting the registers once will be enough (I believe they're nonvolatile?)
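For the curious, msr-tools can at least show what's currently set. 0x610 is MSR_PKG_POWER_LIMIT on recent Intel chips, though double-check against Intel's docs for your CPU:

    sudo modprobe msr
    sudo rdmsr 0x610    # raw hex; fields decode per Intel's SDM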


If your machine crashes, overheats, or has graphical artifacts at 100% load, it's broken and you should fix it.


If your machine spins up its fans for no discernible reason throughout the day, it's broken and you should fix it.


Most people will categorize crashing, overheating, and graphical artifacts as a more severe problem than "what's that small noise I hear once in a while".


> no discernible reason

What if the "no discernible reason" is just poorly coded software?


That's the definition of a discernible reason. If the CPU is being used hard enough to need more cooling, then the question is answered.


That's too limited a view of the issue. Why is the CPU running that hard? If the user is not doing anything CPU-intensive, yet the fans spin up, clearly the CPU is hot, but why? What is it doing?


Then the PC and the OS are off the hook and the search continues with the software.


It’s not. Using the cpu is a supported feature of normal computers.


If you can say "the fans are spinning up because this piece of software is using the CPU more" then it's not "no discernible reason".


I suspect it's partly a sensory (-integration) issue.


>Not going to lie, that’s pretty horrible.

The stock firmware is, uh, not good. I ported Coreboot to the second batch boards (https://github.com/mjg59/coreboot/tree/X210_good ) and things improved significantly - I wrote it up at https://mjg59.dreamwidth.org/50924.html


first link should have a lowercase x: https://github.com/mjg59/coreboot/tree/x210_good


Does it use the regular thinkpad EC, or is that also something custom?


It's a custom EC running some pretty shitty firmware


You say that's horrible, but I have never gotten more than around 5 hours from my MacBook Pros (my own and my work's). Maybe I could if the only thing I ran was Safari and iCal, but that's not my use case. If their number is from normal use, I'd consider it normal. :shrug:


iStat Menus (macOS app) gives you a HUD in the global top bar for things like network and CPU usage. It will quickly show you how many random apps and browser tabs will explode with resource usage over the course of the day. Or why your internet seems slow -- some app is saturating your bandwidth somehow.

Spotify, Slack, and Discord are major culprits, like if someone posts a single gif (are they animated manually by React?). But even some recipe website on a browser tab might go into 100% CPU usage because part of its ad-serving kit was blocked by ublock.

I consider it (or any app like it) essential for maxing your battery lifespan. Really wish it was built into macOS.


The default Activity Monitor does all of that plus energy usage. It won't go in the menu bar however.


Activity Monitor itself also consumes a lot of CPU. And you tend to only use it when you've already suspected a problem, like when your fans start running.

What's nice about having a global CPU graph is that you get used to normal idling levels and will start to notice anomalies.

You're also just closer to the pulse of your computer. What exactly are normal bandwidth consumption levels over the course of a day? What speeds are you normally getting? Which apps and which actions seem to be the hardest on your resources? I think these are just nice things to know about the device you use every day like how you might get used to the sounds and feel of the car you drive every day.

For example, I often see HNers suggest that a good computer confers no benefits for web browsing. Meanwhile, I can say that web browsing is the most resource intensive thing going on in my computer. What your CPU graph as you click around the internet. Or when decoding a high res Youtube video or a muted autoplaying video on some news article. Or scrolling Facebook/Instagram.

I have an older machine that stutters while playing 720p+ Youtube videos unless the CPU is idle, and busier webpages take much longer to render and lock up the UI before I can click around. A better computer can save an impressive amount of time in the long run. It's not for nothing!


> Activity Monitor itself also consumes a lot of CPU. And you tend to only use it when you've already suspected a problem, like when your fans start running.

MacOS always tracks the energy usage stats, so you don't have to keep Activity Monitor open; just look at the 'Average Energy Impact' column. iStat Menus also uses a fair bit of CPU, similar to Activity Monitor, it's just split into a couple of processes so it doesn't show as much. Not that this is a reason not to use it (I love it and use it daily), but it's not a good contrast point with Activity Monitor.


My main point is that a global CPU graph helps you identify acute CPU consumption as it happens which is the main mechanism that helps me identify things like Spotify getting stuck in a spinlock or a browser tab devouring my battery. Seeing "Google Chrome" show up in Activity Monitor's energy chart when you check it periodically isn't as helpful.

Good point about iStat Menus consuming its own resources. It has two processes afaict with its main process staying below 1% CPU. I bet that number sees an increase once you start turning on more HUDs like the temp/fan sensors and jacking up the update frequency though.


> My main point is that a global CPU graph helps you identify acute CPU consumption as it happens which is the main mechanism that helps me identify things like Spotify getting stuck in a spinlock or a browser tab devouring my battery. Seeing "Google Chrome" show up in Activity Monitor's energy chart when you check it periodically isn't as helpful.

If you want to know 'what is using all by battery' after a couple hours using it, you want to know the average power usage by app, which activity monitor gives you. Battery life is the point of this thread, not random lock ups.

> Good point about iStat Menus consuming its own resources. It has two processes afaict with its main process staying below 1% CPU.

I wish we could just get rid of 'cpu %' as a metric on modern machines, or at least scale it based on P-state. Saying process X is taking Y% of the CPU is pointless if you don't also communicate the P-state, and chances are you don't even know which core the process is running on. Point is, I'm sitting here almost completely idle and there are plenty of tasks 'using 3-5% of my CPU', while the CPU is running at 800MHz, nowhere near its all-core peak of 2.7GHz, much less its single-core peak of 4.5GHz. What does 1% CPU even mean in this world?


> Activity Monitor itself also consumes a lot of CPU.

I run Activity Monitor 24/7; it's actually not that bad, especially if it's not visible.


When I got the first Macbook Pro Retina in 2012, I would get ~10 hours on OSX. The newest ones I can't get ~4 hours from...

I also got the first Macbook 12" and I would get better battery life running Windows 10 than I would OSX; with Windows 10 I would get ~12 hours or so. Windows 10 on the Macbook Pro drained the battery quickly though, because you couldn't switch to integrated graphics.


If you tweak your power management daemon to force your CPU to stay in the lowest couple of SpeedStep levels, you can keep it much cooler at the expense of performance. This is effectively underclocking on the fly. When you do something fancy, your CPU clock frequency will increase, but not enough to heat it to the point where the fans turn on.

Most MacBooks don’t turn their fans on until the aluminum bottom is tenderizing your lap (and/or melting the table). You could configure this on most non-Apple laptops too, but running the fans earlier keeps temps down and probably extends the hardware's lifetime.
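On Linux, the quick-and-dirty way to pin the frequency down is cpupower (usually shipped in a linux-tools or cpupower package); the 1.2GHz cap below is an example, use whatever your CPU actually supports:

    # cap all cores at 1.2 GHz
    sudo cpupower frequency-set --max 1200MHz
    # lift the cap again later (substitute your CPU's real maximum)
    sudo cpupower frequency-set --max 3400MHz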


The key is not just disabling the fans and letting it all boil; the key is not producing the heat in the first place, because of the undervolt.


Yep. My X220 runs at 800 MHz unless I’m watching a YouTube video, at which point it switches to 1.2 GHz-ish. I don’t let it get much above that level otherwise my battery life and temperatures suffer.


That's because it's a random chinese board that tries to provide up-to-date performance but isn't focused on up-to-date thermals/efficiency.

If a modern 13.3" screamer with battery life is what you are looking for, however, check out the just-released Thinkpad X390. It's even more modern and the battery life is a staggering 17-18 hours.


It has a 15W CPU in it, it's not a screamer. It's not hard to get extremely long battery life in a laptop: just toss in an extremely low power (15W or less) CPU and get the battery as close as you can to the somewhat arbitrary 100Wh limit (it isn't really a hard limit, but no one will sell a laptop you can't take into the cabin of an airplane).

What is hard, however, is getting long life out of a performance laptop with a 45-60W CPU when you're capped to a 100Wh battery.


Slightly off topic, but on fans: I recently got a totally fanless mini-pc (Zotac), loaded it up with 16GB of RAM and a 1TB SSD, and the utter silence in my office is amazing. Since switching off Apple (I used the fanless 'macbook' models for a while) I'd normally work with headphones, but I've found the difference to be noticeable.

FWIW it runs OpenBSD and is currently at 52°C. When doing a heavy compile or something it can get up to about 80. Mounted to the back of a monitor, you can't even see it.

10/10 would recommend.


You should see my X62, the battery is so bad. I even bought a "brand new" battery and it lasts about 2 hours. We don't notice it, but battery technology has really moved forward over the last decade.


> You should see my X62, the battery is so bad. I even bought a "brand new" battery and it lasts about 2 hours.

That's probably a fake or aged battery, or your power management is set up incorrectly. An X62 with a fresh battery lasts a lot longer than 2 hours.


Where did you buy yours? I bought an old X61 and the battery was crap. I can only find affordable genuine ones on eBay. The ones on Amazon were $200... So it didn't make sense to buy them


I buy 3rd party replacement batteries on Amazon. The original batteries have degraded by now, and replacing the cells is a tricky operation: https://hackaday.io/page/247-replacing-lenovo-laptop-lithium...


It was 4th generation Haswell (4000-series CPUs) that cut power consumption in HALF! In fact Haswell was so awesome Intel didn't make measurably better CPUs until 7th generation (Kaby Lake)!


I have a non-ssd Asus u36sd from 2011 running Windows 7 that still has around 5 hours of battery life with moderate screen brightness, a Jetbrains IDE running alongside a vagrant VM, Firefox, music etc, so I think you should or could do better.

My more recent Macbook pro at work is basically worse though, not sure why.


Even if it was an unused battery it might have been old and degraded. One of the problems with replacing batteries in older devices is that if the battery isn't produced anymore the replacements will not have full capacity either.


Would be awesome for the retro thinkpad scene if someone started making new batteries for the old stuff.


You can just crack open a ThinkPad battery and put in new 18650 cells!


oh? I've tried this before, the battery charge/fuel gauge circuit entered a permanent fault state and none of the tricks I found online worked. It was an x61 battery.


I would love to learn how you did this!


How I failed to rebuild a thinkpad battery?


Yep the fan spinning up when nothing special is happening is infuriating - for me 99% of the time it's Debian's "unattended upgrades" which I now realise I should just turn off since I update/upgrade/dist-upgrade relatively frequently anyway.


Yep, with this unattended upgrade thing I can’t make fun of windows update anymore. It’s horrible.


It's a bit clunky (as is 'apt' itself, tbh), but it should only ever install security updates and targeted bug fixes. You can configure apt relatively easily to have it download updated packages but leave it to the user to actively install them (check out the apt configuration files under /etc/apt/ - specifically the 'apt.conf.d' folder).
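Concretely, the knobs look something like this (file name and values are just an example; distros ship their own defaults):

    # download updates automatically but leave installing them to the user
    sudo tee /etc/apt/apt.conf.d/20auto-upgrades <<'EOF'
    APT::Periodic::Update-Package-Lists "1";
    APT::Periodic::Download-Upgradeable-Packages "1";
    APT::Periodic::Unattended-Upgrade "0";
    EOF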

As for Windows 10, it's in its own class of terrible design - try as they might, the Linux folks are nowhere close to matching it.


Yep, it should only ever install some high priority security fixes and such. Actually, I suspect it's not doing anything complex itself (kicking off processes that perform the downloads + upgrades, but not actually applying them?), so it seems there's something weird going on :-(


If your fan is spinning up when scrolling in Slack it's likely an indication that Electron (Chrome) is refusing to use the GPU for rendering acceleration. This is likely either due to a driver issue or the driver/gpu being on chrome's blacklist. I had this problem once on a hackintosh and as I recall starting Slack from a terminal with the `--ignore-gpu-blacklist` option fixed it.
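For reference, on macOS that meant launching the binary inside the app bundle directly from a terminal, something like this (the path will differ on Linux installs):

    /Applications/Slack.app/Contents/MacOS/Slack --ignore-gpu-blacklist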


Why should I need GPU acceleration to read irc messages?


Almost all the pixels on your screen are rendered through a GPU accelerated pathway, if they weren't even the desktop with no apps open would be rather sluggish.


Because you're using a graphical ui.


and then the GPU fan turns on instead. :p


I only buy fanless laptops now.


I run Linux exclusively on my fanless Chromebook and remote computers, so I guess I kinda do the same thing ;)


What Chromebook do you use if I may ask? Do you use Crouton or did you completely flash Linux on it?


If you can find an Acer C720 with the i3, it's a superb machine: no fan, pretty fast (faster than an old ThinkPad), and extremely comfortable to carry! It's still the number one selling Chromebook of all time! I have 4GB of RAM, a 256GB 2242 SSD, and an IPS display in mine! 7 hour battery life (new), 6 hours for most of the design life. Totally trivial to repair!


Personally, I find the Acer C720 to feel and look a bit too "cheap" for my tastes.


I dual booted on the new Samsung Chromebook 3: https://saagarjha.com/blog/2019/03/13/dual-booting-chrome-os...


Do you have a list of fanless laptops that you might recommend? (I only know of the Huawei MateBook X and the Xiaomi Mi Air.)

I have to bear with my laptop's loud fan, but I would like to replace it with a fanless one in the future.


I've had good experiences with some chinese manufacturers (first-hand experience with Jumper and Alldocube; I've heard of Chuwi and Xiaomi). They're putting out good quality, very cheap products. I'd just go for a metal/aluminum body if you can find it.


Whatever you buy, don't buy the ThinkPad X1 Extreme. Its fans are on all. the. time, and the CPU goes into thermal throttling even when idle.

https://www.google.com/search?q=x1+extreme+fans


At this point, the real P series (p52/72), which used to be w series, are probably the only thinkpads that don't throttle. And those have other problems like the inclusion of an nvidia card and off-center keyboards. They've also devalued the T series now.


If it's about the same as the P50 I picked up, the NVidia card is not the default, so it only factors in if the user selects it. The off-center keyboard is a minor nit, I really wanted the numeric keypad (and love the keyboard in general).


The issue with the nvidia card is the wiring for external monitors. Thinkpads (at least my w530) made the extremely dumb choice of wiring only the nvidia card to the external display outputs. So you cannot just ignore the nvidia card.


It really is the worst of all worlds. P series performance...for 15-20 seconds.

Carbon series form factor...3 hours at a time.


> lol. One of the things that really drives me nuts is my computer’s fan turning on when I know it really shouldn’t be. I have lived and worked with people for whom having their fan randomly turn on for no reason is completely normal, and I just can’t understand how they can bear it. If this happens to me, you can bet I’m digging through Activity Monitor and killing the culprit before the fans can get fully ramped up.

At a place I used to work, laptops were routinely getting bricked by the anti-virus software. Basically it would run a scan, the laptop would overheat and die.

At some point, it seems like the "cure" can be more dangerous than what it's designed to fix.


> > Battery life is a little over 4 hours with the flush battery (55Wh) and 6-7 hours with the extended battery (80Wh).

> Not going to lie, that’s pretty horrible.

On the next line he says that with a newish kernel he can get 6 hours with the flush battery. That is not bad at all.

> Update (2019-03-17): I managed to get PC7 idle by upgrading my kernel to 4.18 and replacing the r8168 module with r8169. Battery life has increased significantly. I now get 6 hours with the flush battery and 10 hours with the extended battery.


With all the hype about the efficiency of ARM processors and Apple thinking of putting their custom designed processors in their laptops, I wonder if we'll ever get better battery life on our laptops.

I know nothing will be very significant, but I wish my laptop could at least last the whole day on battery.

https://blog.cloudflare.com/arm-takes-wing/


I (almost) always work where I can plug in - so I treat the "battery" more like an internal UPS. 30 minutes would be fine :)


As part of our enterprise security improvement project, IT removed local admin from all people who don't strictly need it. They'll replace it with creds that can be unlocked for a short period of time, but that's not here yet, and honestly, I don't need it badly enough on my laptop to raise a stink.

Now when my fans spin up, I just blame McAfee.


I used to use SMC Fan Control on my older MacBook Pro to just run the fan at medium at all times. I definitely miss it; you can't use it on the new touchbar MacBook Pros.


Interestingly, macOS has been the only OS where I have not had this be an issue. 95+% of the time the fans are off, and rogue runaway processes are quite rare (which is not true of, say, Linux or Windows, in my experience).


Yes, the only time I hear the fans on my MacBook is when I boot into Windows.


Sigh...

I got a Thinkpad E485 ( wanted to try AMD Ryzen mobile) and with Ubuntu installed, I get about 2.5-3 hours on the battery, which is... I forget now. I think it's 48Wh.


You could drive about half a mile in a TM3 on that bigger battery (with heat and AC off).

You really should be able to cover a full day’s work on 80Wh.


What I wouldn't do for 4-hour battery life.


What computer do you have, and what are you running on it?


acer aspire v17 nitro. ubuntu 18. Has both SSD and HDD. Four years old now, holding out till thanksgiving deals to go to a smaller laptop with longer battery.


I use Arch Linux on a Dell laptop and my fan is pretty predictable. I have never used Windows, so I am curious: does the fan really turn on when you don't expect it?


This is a good time to bring up the fact that there was never an industry-wide standardization effort for laptops. A standard form factor means components would be re-usable between upgrades: the laptop case, power supply, monitor, keyboard, and touchpad could all be re-used without any additional effort. This improves repairability, is much better for the environment, and means higher-end components can be selected with the knowledge that the cost can be spread out over a longer period.

For desktop PCs, the ATX standard means that the entirety of a high-end gaming PC upgrade often consists of just a new motherboard, CPU, RAM and GPU.

A 2007 Lenovo ThinkPad X61 chassis is not that different to a 1997 IBM ThinkPad chassis (or a 1997 Dell Latitude XPi chassis). If the laptop industry standardized, manufacturers would produce a vast ecosystem of compatible components.

Instead we got decades of incompatible laptops using several different power supply voltages (and therefore ten slightly-differently shaped barrel power plugs), many incompatibly shaped removable lithium-ion batteries, and more expense and difficulty in sourcing parts if and when components break.

A little bit of forward thinking in the late 1990s would have saved a lot of eWaste.


Some parts, such as batteries, storage, RAM etc, should at least be standardized.

Manufacturers probably don’t want to standardize on the remaining motherboard/graphics/chassis/cooling because a laptop isn’t like an atx computer where you get modularity at the expense of wasted space. A laptop is basically a 3D puzzle with thermal components. Few consumers would buy a laptop with even a little wasted volume or weight, even if it meant better serviceability and upgradeability. Same with phones. We aren’t going to see modular phones beyond the concept stage either.


I generally agree with your comment. However, when you wrote,

> Same with phones. We aren’t going to see modular phones beyond the concept stage either.

I disagree. I'm writing this on a Fairphone 2, which I bought for its modularity & because running Lineage OS (or any other OS you choose) doesn't void the manufacturer's warranty. While I'm sure Fairphone's sales are small compared to the broader industry, I think they've shown a market exists for ethical, modular phones. I've seen other Fairphones in the wild here in France, as well as seeing them for sale on used goods sites like leboncoin.fr.


Batteries (or rather, individual cells) do have a standard: 18650. Unfortunately too thick for the ultra-thin laptops, but the older Thinkpads use them. I suspect safety is the reason why no one makes replacement laptop battery "empty shells" that take 18650s and have the appropriate balancing/protection circuitry to interface with a laptop, but then again you see mobile phone powerbanks being sold this way... go figure:

https://megaeshop.pk/5v-dual-usb-power-bank-box-diy-shell-ca...


I just wish there was a portable all-in-one PC with modular components (since vertical parts don’t need as much testing as horizontal).

But surplus laptops are in such quantity that I’d be fine replacing my tablet with a laptop.


There's always been a trickle of machines like that. The problem is that they're targeted towards industrial usage and RIOTOUSLY expensive.

https://www.bsicomputer.com/products/fieldgo-m9-1760 for example (the first vendor I saw that actually shows prices, as opposed to just request-for-quote) It starts at nearly $2400 for a low-spec Celeron, and I'm not sure it even has an onboard battery.

What I could see as viable would be a micro-ATX case of similar dimensions, sold as a barebones for like $300-- use the extra volume from not accommodating ATX mainboards to store batteries and charging circuitry, which can be off the shelf because space constraints are minimal. Pop in some reasonably priced desktop components, and you'd have a competent luggable for under $1000.


You can get mini-ITX cases that are smaller than a console with full-sized graphics cards inside... they cost like 300 bucks though :(


Not really, they are mostly around 100 bucks; some have integrated PSUs and might be >$200, but $300 seems like a lot to me.

I'm using a Silverstone ML08 (about $100 at the time), which is slightly bigger than a PS4 Pro but fits a full-length, double-slot GPU.

The ML05 is even smaller, costs about $50, but fits only a single-slot GPU.


I might be getting one of these.


> A standard form-factor means components would be re-usable between upgrades

We don't even have to go that far. Just ensuring that laptops can be serviced by their own users would go a long way to reduce e-waste. i.e. not soldering RAM chips to the motherboard, making it feasible to remove every single part (not gluing the keyboard to the MB for example), etc... instead of pursuing an ever thinner laptop design, which has practically no use.


MacBook Pros not being user serviceable at every component level doesn’t mean they’re not environmentally friendly - not by a long shot. In fact, building a device like that might even shorten lifespan in a laptop form-factor, not to mention no one wants to carry around a heavy, clunky machine so it likely wouldn’t sell anyway.

Here’s an example report that mostly has to do with the production and recycling aspects: https://www.apple.com/environment/pdf/products/notebooks/15-...

When people are done with their MacBooks they don’t just throw them out - they sell them or hand them down to their relatives/kids because they still work well enough, are supported by the manufacturer, are durable and have very high resale value in secondary markets.

Robust engineering, longevity, support, and resale markets do more for the environment than making components user-replaceable.

My old 2011 MacBook Air is still going strong and being used by my mother. If anything goes wrong, she can take it to the Apple store and get help promptly. She still gets software updates, and that thing can STILL be sold for ~$250-300 on eBay, Swappa or Nextdoor. If the machine breaks completely, she can take it into the Apple store to get it properly recycled in almost any part of the world.

That’s what minding the environment looks like. You have to look at the entire lifecycle of the product from the moment the raw materials are sourced all the way to making it easy to recycle when a product is end-of-life.


Apparently you haven't seen any of Louis Rossmann's videos on Youtube. Let's say your grandma's MacBook stops working because of a blown fuse on the motherboard. Something like that would take Louis 5 minutes to repair, but the Apple store would just replace the whole motherboard and charge $$$. How is that environmentally friendly?


One: shout out your favorite YouTubist, I guess, but a repair Apple makes is a repair Apple has to support.

Two: it's much, much harder to support a repair done on-site with a soldering iron than it is to replace a part. These repairs are much more likely to fail under both normal and unconventional use and then will come back for more repairs--which are themselves, still, expensive to provide.

Three: waste concerns have to factor in what Apple does with the part after they do the swap. (I have no insight into what they do, but your comment ignores this.)


>> One: shout out your favorite YouTubist, I guess, but a repair Apple makes is a repair Apple has to support. <<

Saying he is my favorite Youtuber is a bit condescending. I mentioned him, because he is a loud proponent of the right to repair movement.

>> These repairs are much more likely to fail under both normal and unconventional use and then will come back for more repairs--which are themselves, still, expensive to provide.<<

If that was true, I am sure Apple would choose to repair parts instead of replacing them. ;)


> If that was true, I am sure Apple would choose to repair parts instead of replacing them. ;)

Of course it's true--everything from "that fan's just going to get dirty again, and faster, because it's been blown out but can't be re-sealed outside a factory" to "that solder joint is being done by somebody making fourteen bucks an hour, boy I hope I'm not relying on that long-term".

Why would a company that makes its money off of selling the closest thing to a unified end-to-end experience take the risk of a dissatisfied customer because of a frustrating defect remediation experience?

The quoted point is an example of a fundamental misunderstanding of how Apple views its customers and how Apple makes its money. But stuff like that is a closely-held truth in the various repair-uber-alles communities on the web regardless of reality. (And then, as 'Operyl notes, your cited YouTubist attempts to shore up his own little slice of community by instilling in them the "enlightened"/"sheep" dynamic. Petty little cult leader, that.)

Sorry that you read some real distaste for that mess as condescension, but not sorry to voice that distaste.


>> Why would a company that makes its money off of selling the closest thing to a unified end-to-end experience take the risk of a dissatisfied customer because of a frustrating defect remediation experience?

You make it sound like Apple has never done it before.

Case in point: the overheating early 2011 Macbook Pros - a problem experienced by thousands of customers.

Apple basically pretended the problem didn't exist for well over a year (there was a gigantic thread about the issue in the Apple support forums). By the time they did issue their recall (or "repair order", if you want to use Apple's euphemism), a lot of people had already divested their dead Macbook Pros for a loss.

Mine had bricked just after my AppleCare expired, and I wasn't about to spend $500+ to get a replacement logic board (which basically had the same defect, except it was a brand new board. Source: I had replaced my logic board under AppleCare only to have the problem recur within two months). I was lucky that I didn't dispose of my Macbook Pro before the repair order, but I had bought a replacement laptop by the time it was issued (spoiler alert: it was my first non-Apple laptop purchase in a decade).

They also put up barriers to getting the repair order. You had to prove you had the heat issue and that it was causing crashes. Since mine was bricked, it was easy. But a friend of mine (who had two of the affected models) had to jump through hoops at the Apple Store to get his fixed.

Those early 2011 Macbook Pros were mostly high end 15" i7 models, meaning they were not on the lower end of Apple's Macbook Pro line. People paid good money for them. If Apple didn't have their heads in the sand and gave everyone replacements (i.e., a 2012 model, which didn't have heat issues) as the problem occurred, it would have been a rounding error for them. But they didn't do that.

>> fundamental misunderstanding of how Apple views its customers and how Apple makes its money.

Speaking from my one experience - I didn't feel like Apple was interested in my experience at all. While I never considered myself a fanboy, I was very loyal to Apple and totally invested in the ecosystem. After my experience with the 2011 Macbook debacle, I abandoned them completely. It meant writing off a lot of money spent on Mac software, mobile apps, etc.


He’s cringe at best, just as bad as the rest of them at worst. He’s playing for the camera, the audience. I wouldn’t take much of what he says seriously, but that’s just me I guess.


Plus he outright lies and then soaks up the attention it brings him: https://www.reddit.com/r/apple/comments/9pow06/louis_rossman...


My son spilled milk on our 2015 MacBook Pro. Apple wanted $1,100 to fix it. It took two separate shippings to New York, but Louis Rossmann fixed it for $550. You need to wake up, grow up, and grow a brain. Apple's excessive greed is real. The fact that you were lucky and haven't dropped or spilled anything on your laptop in the last 5 years is not evidence that Apple is a great company!


I’m sorry, what? This is the exact shit I’m annoyed about. He is inciting stuff like this, telling his user base to call anybody that disagrees with them things like “sheep”, it’s even in his logo. I do not agree that the best thing to do is call people you disagree with “asleep” or “sheep,” or to “grow up/a brain.”


> Robust engineering, longevity, support, and resale markets do more for the environment than making components user-replaceable.

But if the parts were also user replaceable it would be better for the environment.


"User serviceable" doesn't imply that the user will actually perform any service. I would be willing to posit that for the vast majority of users, an ATX desktop is just as "serviceable" as a Macbook Pro. In the case of the desktop, if it breaks, they take it to their IT department or Best Buy and get a technician to fix it. In the case of the Macbook, they take it to their IT department or the Apple Store and get a technician to fix it. And the Macbook Pro is a darn sight lighter, more portable and more attractive to have sitting on your desk...


This is not taking into account that most people won’t know how to fix the problems that arise from connectors wiggling loose or the replaceable hard drive failing. Additionally, there’s also the problem of the connectors themselves wearing out and breaking: e.g. I have a Lenovo X220 that no longer charges because the power cord connector is broken.


That requires significant trade-offs in durability, weight, and design.

Not to mention you don’t want the typical user (forget the HN audience) to replace the components themselves.

Most professional users are on corporate enterprise device plans and you don’t want employees or the IT department replacing components either. It’s far better and cheaper to get the employee back up and running with a new machine while the one in need of repair gets shipped off under enterprise warranty.


In many cases, a well-designed, durable product can be repairable. While the actual earbuds were not wonderfully well-reviewed, the Samsung Galaxy Buds were both tiny and legitimately repairable: https://www.ifixit.com/Teardown/Samsung+Galaxy+Buds+Teardown...


And they were a shit product. As you yourself admitted. It doesn't help if something is supposedly "repairable" (by the 0.5% of buyers who might be inclined to do such things) if the product is such crap that it gets thrown away after a few weeks.


>My old 2011 MacBook Air is still going strong and being used by my mother.

Coincidentally, this is also a machine where you can still swap the SSD. With the help of some Alibaba engineering, you can even use a stock m.2 SSD.

The latest Macbooks are bullshit, you cannot exchange anything.


There’re upgrade-friendly laptops on the market. I’ve replaced RAM, disks, keyboard, LCD, CPU, wireless cards in my laptops. Soldered CPUs are unfortunately inevitable on modern ones, but many other components are still replaceable if you pay attention at the time of purchase. Usually voids warranty but I don’t care too much.

As a nice bonus it sometimes saves money. I’ve only picked my netbook for CPU (i3-6157U, 64MB L4 cache), GPU (Iris 550, 0.8 TFlops) and display (13.3” FullHD IPS). Upgraded to adequate amount of RAM (16GB) and larger and faster M.2 SSD. Both were too low out of the box, and even today there’re not many small laptops with 16GB RAM.


Agreed.

> Soldered CPUs are unfortunately inevitable on modern ones...

To be fair, even on desktops, replacing a CPU on the same motherboard is a pretty niche thing in my experience. Not to say people don't do this, but most of the people I know upgrade both at the same time, either because of incompatibility or because of substantial gains with the newer MB. So soldering the two together is not as bad as gluing the keyboard to the motherboard in my eyes.


In some cases, upgrading a CPU prolongs useful life of the device.

The desktop I’m using now had i5-4460 at the time of purchase, eventually upgraded to Xeon E3-1230v3. Only going to upgrade motherboard after AMD releases Zen 2 desktop CPUs.

A family member uses a laptop that initially had an i3-3110M. I’ve put an i7-3612QM there; it’s comparable to modern ones performance-wise despite the 6-year difference, e.g. cpubenchmark.net rates the i7-3612QM at 6820 points and the i5-8265U at 8212 points (largely because it’s a 35W part versus 15W).

I agree about glued keyboards. Keyboards are exposed to the outside world and also subject to mechanical wear. The only thing worse than that is soldered SSDs. They make data recovery very hard, and the rate of innovation is still fast for them: SSDs that become available a couple of years from now will be both much faster and much larger, so upgrading them regularly makes sense for UX.


On a desktop it is possible to replace the motherboard; on a laptop, not so much. So not soldering the CPU would at least give you the ability to upgrade the processor to the fastest one supported by that motherboard.


True, you have a point there. Of course, standardizing on a few motherboard form factors would be even better... one can always dream. ;)


My current weapon of choice is a Lenovo Y50-70. Adding RAM is super easy, but I had to replace the keyboard at one point, which was a nightmare since it was all glued and had small pins holding it together; for some reason you also have to disassemble absolutely everything before you actually get to it. In the end I basically just tore the thing out semi-carefully, and the new one is just "pinned in" (the glue isn't even needed). Another adventure was the screen frame, which was breaking more and more each time I opened it. For that I drilled holes in a few places and bolted it together with some really small nuts so it still closes fine. It was annoying but a fun experience; it got me over my hardware-tweaking anxiety for good. I doubt it gets crazier than drilling holes in your laptop.

So yes, laptops should absolutely be made easier to modify; the components get old really fast, and I don't wanna buy the whole thing each time I want an extra bit of RAM or some small part breaks. It's one of the things that make me steer way clear of Apple stuff.


> There are upgrade-friendly laptops on the market.

Yes, but very few, and they're getting fewer and fewer as we speak. Even Lenovo, which was famous for that, has ended up soldering RAM in recent models and making the battery a hassle to replace, while it used to be external before.


The enterprise market is huge. Consumers like thin and shiny things; corporations don’t care, but they employ people whose full-time job is counting expenses. Upgradeable laptops are good for them because they can get exactly the right hardware without paying for what they won’t use. They rarely upgrade themselves, vendors do, but unless the laptop is designed to be upgradeable, vendors are going to have a hard time serving their enterprise customers’ requests, let alone doing it fast.

For an example of what I’m talking about, see this manual for a current-gen Intel-based 13.3”: http://h10032.www1.hp.com/ctg/Manual/c05695299.pdf

Update: and if you’re going to install Linux, these laptops can always be bought without a Windows license. Corporate customers use volume licensing; they don’t need these OEM Windows keys and aren’t willing to pay for them either.


I doubt that enterprises (or even their vendors) do much upgrading. Instead, they have those machines on a refresh cycle and replace them every three years. They do, however, often prize repairability: if you have a fleet of hundreds of the same model of machine, it's easy to maintain spares of the components most prone to failure/damage.


> The enterprise market is huge.

Do you have any numbers comparing the size of the consumer market vs. the enterprise market?


I don’t have numbers. But this article https://www.gartner.com/en/newsroom/press-releases/2018-10-1... says the PC market is driven by business PCs, both globally and in the US and EMEA.


Consumers prefer super-thin soldered machines, sadly. The proof is in the shopping.


Not always, sometimes you cannot see the proof because a company just doesn't offer any other options. If there was a modern laptop that ran macOS that was user upgradeable, I would absolutely get it in my next upgrade cycle. Alas, there isn't one.

Also, companies aren't always superrational logic machines that have coldly calculated their every move; there's sometimes a lot of collective delusion going on that can leave their consumers in the cold, who then just make do with the best of a bad lot that's offered. Recall the recent iPhones: suddenly every other phone had a notch even when it served no purpose; or the removal of audio jacks, for example. There was NO consumer preference expressed there, just one company that decided it that way for its own purposes, and others blindly copying it.


I'm not by any means knowledgeable on the hardware design front, but I think the notch was a hardware design solution to a problem (I'm not sure what exactly, but probably fitting more components or saving space inside the phone), and all the others copied it because it was a clever solution to an existing problem and it didn't appear to affect users much.


Consumers "prefer" what the billions of pounds spent on marketing tells them to prefer.

If companies spent money telling consumers to value upgradability and not to buy new stuff all the time, then we'd value that more... but that doesn't sell more stuff, it just helps save the planet, so why bother...


Consumers don't know that they have the option to fix their machines. They are trained to toss devices (not just computers, but also cellphones, TVs, and appliances) instead of taking them to a repair shop.


In defense of upgrade culture, you ever wonder where those old phones you trade in go? Phone companies have been turning them into profit by shipping them to the developing world. We live in an era where even the most remote and impoverished places on Earth have, at minimum, a cell phone in their villages. And it's that crappy circa-2000 Nokia you had that plays Snake. Now they can call emergency services on demand or keep in touch with long-distance loved ones.

Capitalism is such a blessing.


Are you serious? No one is using a 2000 Nokia, even in the third world. Do you think it's cheaper to collect phones in the first world, wipe them, test them, and ship them to the third world (assuming they could even connect to any cellular network) than to mass manufacture $15 plastic Androids?


Has Apple offered an upgradeable laptop alongside a non-upgradeable one? It would be possible to draw that conclusion if they sold a new-style MacBook Pro alongside an older-style one with similar specs.


Yes. Between 2012 and 2016, Apple sold a version of the 2008-2012 unibody MacBook Pro that had an optical drive and upgradeable RAM, while simultaneously offering the Retina MacBook Pro, the first to solder the RAM to the motherboard. Arguably from a specifications standpoint the Retina MacBook Pro was the superior model due to its high-resolution display and its use of a fast proprietary flash drive (replaced with standard NVMe in later versions) instead of slower SATA flash drives. Eventually the unibody MacBook Pro would get long in the tooth due to lack of updates compared to Apple's annually-updated Retina MacBook Pro models, but it was still sold until it was quietly discontinued in 2016 upon the announcement of the controversial touchbar MacBook Pro.


> Has Apple offered an upgradeable laptop alongside a non-upgradeable one?

No, but I would be shocked if they have never run focus groups for this type of stuff.


Consumers are multi-modal, though. Many can't be bothered to dig in and debug, or want a sleek, highly integrated product. Others care less for those things and want an upgradeable, repairable product. It's my hope that economic solutions will find the resources to accommodate both modalities.


No, because such options are almost completely gone from the market. And I can't honestly believe that there is "no market" for it. It's an anomaly because most PC manufacturers are just trying to imitate Apple.


It makes sense to think this when looking at modern consumer tech, but I haven't met people who actually want that sort of thing. It always seems like people are having to settle.


Is that why the sort-of upgradeable 2015 MBPs already go for higher prices on eBay than the absolutely non-upgradeable 2016/17s?


The 2015s don't have the "a single speck of dust gets in the keyboard and disables a key" problem. All later models do. I don't think upgradeability factors into it much.


Soldering RAM is helpful for preventing people from cooling the RAM and removing it to attempt to copy private encryption keys.


Modern Intel (and probably most other) boards use hardware scramblers for other reasons (storing all zeros or all ones causes signal-integrity issues / current spikes), and secure the scrambling routine with a different code at each boot.

So, unless I’m mistaken, cooling the RAM and moving it to another machine doesn’t work any more.


There are two types of scrambling here. One is the bit-swapping and byte-swapping that you use to make DDR3/4 routing possible. The other is the whitening function for high-speed data that ensures you don't have long sequences of the same value without a transition. The latter is a simple pseudorandom scrambler with a fixed random seed generated at boot. It is not cryptographically secure. The former is a simple substitution and quite easy to reverse (and trivial if you have either a schematic or a board to reverse). Both are deterministic and extremely vulnerable to known-plaintext attacks. This is not a security feature.

Source: I'm working on a DDR4 layout right now, and the memory controller scrambling and swapping functions are documented in publicly available Intel datasheets (for example, see https://www.intel.com/content/dam/www/public/us/en/documents... sections 2.1.6 and 2.1.8)
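
To make the known-plaintext point concrete, here's a toy Python sketch of that style of whitening scrambler. To be clear, the 16-bit width, taps, and framing here are invented for illustration (the real controllers use wider, documented polynomials), but the weakness is the same: scrambled zero-filled memory hands you the keystream, and a linear generator's seed falls straight out of it.

    # Toy model of a non-cryptographic memory scrambler: data is XORed
    # with a keystream from a 16-bit LFSR seeded once at "boot".
    # Illustrative only -- the width, taps and framing are made up here.
    def lfsr_stream(seed, nbytes):
        state = seed & 0xFFFF
        out = bytearray()
        for _ in range(nbytes):
            byte = 0
            for _ in range(8):
                bit = state & 1                     # output = current LSB
                fb = ((state >> 0) ^ (state >> 2) ^
                      (state >> 3) ^ (state >> 5)) & 1
                state = (state >> 1) | (fb << 15)   # Fibonacci LFSR step
                byte = (byte << 1) | bit
            out.append(byte)
        return bytes(out)

    def scramble(data, seed):                       # XOR is its own inverse
        return bytes(d ^ k for d, k in zip(data, lfsr_stream(seed, len(data))))

    secret = b"private key bits"
    dump = scramble(b"\x00" * 16 + secret, 0xACE1)  # attacker reads this

    # Zero pages are everywhere in RAM, and scrambled zeros ARE the
    # keystream. Since each step outputs the current LSB, the first 16
    # keystream bits are simply the seed bits, LSB first.
    keystream = dump[:16]
    bits = [(keystream[i // 8] >> (7 - i % 8)) & 1 for i in range(16)]
    seed = sum(b << i for i, b in enumerate(bits))
    print(scramble(dump, seed)[16:])                # b'private key bits'

A keyed, cryptographic scrambler (say, AES-CTR) wouldn't fall to this, which is exactly the point above: this machinery is signal conditioning, not a security feature.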


Seriously interesting. Any pointers?


Couldn't the 3 people in the world who need that just dab some epoxy on there?


Yes, I can't count the number of times soldered RAM has saved me from this. Can you count to zero?


Nobody is going to do this, because good components are a competitive advantage. I can’t see any good manufacturer wanting their good {trackpad, keyboard, case} either being put in a computer that undercuts them or being forced to dumb down their computer to fit the “lowest common denominator”.


To successfully define a laptop standard, it would have taken a consortium of companies. Likely companies which aren't necessarily in the business of selling integrated laptops themselves, but would benefit from the existence of a laptop standard.

It's likely companies like Microsoft (pre-Surface), peripheral manufacturers (eg, Logitech) and motherboard manufacturers (eg, Gigabyte) would have gladly got on board in that era.

It's likely too late to start this in 2019 (but I may be wrong). Certainly the late-1990s would have been the ideal time for this.


This doesn't work if the peripheral manufacturers are themselves big players (which they are): they can already afford to fight to secure a place in the oligarchy of big players, and it's not in their interest to open up space for direct competitors. Whenever you become big enough, you start to share a substantial interest with every other big company: that of not allowing smaller producers to step in.


As I understand it, only a handful of firms actually design their own laptops. Most of them buy from firms like Clevo or Quanta and maybe do final configuration (CPU/RAM/discs). So you'd really only need to convince them.

In a way, this is much like the situation with desktops: Dell and HP were/are big enough to come up with their own custom mainboards and cases, but most smaller shops are going to go ATX.

I suspect part of the reason we didn't see much laptop standardization was that the second-tier manufacturers are weaker in the laptop sector than desktop, as well as being weaker as a whole than they were in 2000 when ATX was becoming a thing.

Outside of a few narrow gaming and workstation niches, there are few use cases where you can't find a suitable big-brand laptop, so the second-tier brands (and the manufacturers that supply them) are in a position of fighting for scraps, not one where they can start promoting the benefits of standardization.

This is likely worsened by the mindset that laptops are unupgradeable-- people bought ATX desktops figuring they'd buy new mainboards in 3 years, but generally assume the laptop is going to be stuck in place.


Because most money is in pandering to the lowest common denominator and scale, most manufacturers are starting to make their own hardware/software integrated combo. Apple did this, Microsoft is now doing it as well, Razer is going there, and all of them are (commercially) better for it. On the other hand, it's bad for 'us' (the more hacker-y users) as we have fewer options. It's why so many stick with sub-optimal solutions like Apple MacBooks (they are not ideal, but the other off-the-shelf options are so much worse) or custom stuff (modified Thinkpads and Dell laptops). While the former isn't ideal, it's at least standard and scalable, while the latter isn't. Not really, at least.


Incorrect. That happens with desktop PCs.

The reason it hasn't happened in laptops is that you would have to compromise on size and form.


This would allow smaller players to step in and start grinding away some market share from the big players. It would also turn the laptop market from a high-margin market into a low-margin one. Standardization is just not in the interest of any big player, so it's probably never gonna happen. If you are a small player and want to go in that direction, you're probably gonna be bought out.

The only way I see would be to somehow get pervasive open standards and libre schematics implementing them, then cut out the big players and get several manufacturers to produce them. But that too is hardly gonna happen, because of geopolitical problems: most of these manufacturers are domiciled in China, so this move would cut too much income from Western (and Korean, Japanese) countries. For that to happen we would have to relocate some manufacturing industry and somehow not put it in the hands of any of our local big players.

The problem here is not some problematic business decisions by companies; it's how we organized our economy. It would take radical changes in economic/industrial policy to make this happen: much stronger anti-trust laws, which would keep companies smaller and force cooperation; publicly- instead of privately-regulated prices, so that you don't die to foreign companies' exports when you start doing this; etc. This would drive cooperation up across the whole economy, take power away from multinationals, reduce waste, and hinder "useless innovation". Long road ahead, but I think that's what we need and that's what's gonna happen at some point anyway: the capitalist class may still be floating currently, but at some point the systemic crisis (financial instability, natural disasters, political instability, scarcity of energy and other basic resources) is gonna hit them too. What we have to make sure is that they don't get more out of sync with it than they currently are.


It’s interesting how competition used to be encouraged in the U.S. and now it’s pretty much the opposite. It’s all about consolidation, oligopolies and monopolies.

If standards lower margins and make market entry easier, that’s what should be regulated for.


Lobbyists rule, and lobbyists are employed by large firms.


Adding more competition isn't an answer, because competition is about individual profit, and thus monopoly. Neo-liberalism, the global free market, etc. do encourage worldwide competition: it's already the current trend. What you probably mean is some kind of "healthy/fair competition"; the fact that we need to add another adjective hints that this is about doing something to balance it. I argue this is about encouraging cooperation. A good balance between both leads to interdependence, which is exactly what we would like: a state where everyone has some possibility of moving around a bit, but where nobody is free to ignore what the people they interact with have to say.

A ref I really want to push: "Mutual Aid: A Factor of Evolution" (Kropotkin, 1902). There is a whole part mostly about the evolutionary analysis of altruism (the rest analyzes several human social orders throughout history: pre-medieval villages, medieval cities and 19th-century industrial cities).


There was a trend for a while of making business/power laptops much more configurable (I have an old Dell with a hard drive cage that swaps out without removing any screws). But most laptops are more about form rather than function; their design requires reworking all the internals to prevent getting a big clunky heavy box that overheats.

For very low-power machines you might have tons of internal space free, but more powerful laptops need complex heat management in addition to reducing size and weight. It's only now that we have very advanced fabrication techniques and energy-saving designs that we no longer have to hyper-focus on heat exchange.

If size and heat and weight weren't factors, you can bet that a standard would have arisen to manage interchangeable parts. But soldered RAM is a good example of why that's just not necessary, and can be counterproductive for reducing cost and maximizing form factor.


My experience has been limited by the fact that components all advance at the same rate, and to get everything to play nice(r) with each other, you have to upgrade everything. "A new motherboard, CPU, RAM and GPU" is almost buying an entirely new computer. You save a few hundred bucks by keeping the PSU (or maybe changing it too after 5 years) and the case, assuming the ports didn't change.


Being able to continue to use the display, keyboard, mouse/trackpad means you can choose higher-end components.

Even if you don't want to keep your widescreen DVI display from 2008, the interoperability means that when you drop it off at an e-waste center, it's more likely to be reused in its current state for a few more years, rather than immediately recycled (reduce, reuse, recycle!)

I do agree there is some degree of change in interfaces over time (like IDE to SATA to NVMe M.2), but if you build a system for a similar intended use case, the changes within any given 5-year period are small. This means the upgrades you do over a 15-year period will go from a 2.5" platter drive to a 2.5" SATA drive, or from an 800x600 to a 1024x768 display, but not both at the same time (with a different but significant set of components being shared every upgrade).


The LG Gram teardown on iFixit was amazing. It's "moderately difficult" to remove almost everything including the trackpad and parts I forgot existed.

https://www.ifixit.com/Guide/LG+Gram+15-Inch+Repairability+A...


That’s crazy. I have one and it seems impossibly light for its power. It’s an absolutely nuts achievement.


Standardization limits innovation. If we had standardized on laptop form factors in the late 1990s all laptops would still be an inch and a half thick, and all screens would still be 4:3.


I doubt this.

The standard ATX form factor has been revised to reduce size over the years, with the vast majority of accepted iterations maintaining the same mounting-hole and IO-panel locations. I literally have a mini-ITX board sitting in a case I purchased in 1999. This probably fits the fear you state in your comment, with a reasonably new technology "forced" to consume more space than is necessary, but I think it argues for the opposite by showing that incremental changes to a standard format can allow for wide-ranging compatibility.

For example, when ATX was altered by squaring the board to the shortest length (microATX), it didn't require a new case or a new power supply to be placed on the market in order to be used, because it fit within the "too big" ATX case. Then, when cases that only fit microATX became abundant and another incremental change shrank the motherboard to DTX, we again didn't have to release new cases or power supplies or IO cards to start using this version. It allowed consumers to purchase and use the boards until they decided they wanted to reduce their case size, amortizing the upgrade costs over months instead of requiring larger up-front payments.


How are the laptop companies meant to force you to buy a new one every so often if you can just keep upgrading them?


For desktop PCs, the ATX standard means that the entirety of a high-end gaming PC upgrade often consists of just a new motherboard, CPU, RAM and GPU.

And that's great, if you're into generic beige boxes.

It's been years since I put together my own IBM compatible computers. But in the time since then, I haven't really seen any innovation in desktops.

Yes, for a while the processor numbers ticked up, but then plateaued. Graphics cards push the limits, but that has zero to do with the ATX standard, and more to do with using GPUs for non-graphics computation.

The laptop and mobile sectors seem to be what is driving SSD adoption, high DPI displays, power-conscious design, advanced cooling, smaller components, improved imaging input, reliable fingerprint reading, face recognition for security, smaller interchangeable ports, the move from spinning media to solid state or streaming, and probably other things that I can't remember off the top of my head.

Even if you think Apple's touchbar was a disaster, it's the kind of risk that wouldn't be taken in the Wintel desktop industry.

All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...? I'm not sure. Even liquid cooling was in laptops in the early part of this century.

Again, I haven't built a desktop in a long time, so if I'm off base I'd like to hear a list of desktop innovations enabled by the ATX standard. But my observation is that ATX is a pickup truck, and laptops are a Tesla.


Nearly all of the tech you have in your laptop was developed, tested, and refined on desktops. PCI-based SSDs were in desktops before NVMe was a thing. Vapor-cooled processors were on budget gaming PCs 10 years ago. Even the MacBook trackpad was based on a desktop keyboard produced by a company called FingerWorks. High-DPI monitors came to the desktop first. High refresh rates came to the desktop first. Fingerprint reader? Had one on my secure computer 15 years ago. Face unlock a couple years after that.

Desktop is still the primary place for innovation. Laptops use technology that was introduced and pioneered on desktop, then refined until it could fit in Mobile/Laptop. Don't get me wrong, there's probably more work in getting the tech into Mobile than developing it in the first place... But the genesis of the ideas happen on desktop.

Desktop has the opposite mix of freedom and constraints as mobile. Standard internals, but freedom of space. There are dozens of heat-sink manufacturers for PC... Dozens of small teams focused on one problem. There's some variation between chipsets, but nothing that requires major design changes. These teams can afford to innovate... And customers can afford to try new solutions. If the heat-sink doesn't perform, you're out 5% of the total cost. But there's no similar way to try things out for laptops.

For example... Should a laptop combine all of its thermal dissipation into one single connected system or have isolated heat management? It completely depends on usage and thermal sensitivities of the components... It was desktop water-cooling that gave engineers the ability to test cooling GPU and CPU with the same thermal system to determine where to draw the line.


Huh? Most of today's innovation is about power efficiency. This is driven by mobile and laptops, but it benefits desktops and servers as well.


>And that's great, if you're into generic beige boxes.

>All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...?

Have you ever built an ATX computer? I assure you, there are plenty of different standard form factor cases out there. The beige box thing was in vogue in the 90s, but today the big trends are sleek black with tempered glass.

And standard form factor desktop does not equal giant tower. You could also do a mini ITX build, a standard that's been around since 2001 for what it's worth.

High-DPI displays? This implies high-end displays weren't available on desktops first (they were). A decent CRT could produce much higher DPI than LCDs could in that era. Part of the reason Windows DPI independence sucks is that Microsoft implemented it super early, without all of the insights Apple had to do it right, and now there are like 4 different DPI mechanisms in Windows.

All in all I'm not sure what really needs "innovating" so badly with desktop form factor. Do we need to solder our RAM to the main board, is that "innovation?"

You kind of say it yourself:

>that has zero to do with the ATX standard,

So would be the case for any form factor standard. It only dictates how things interoperate.


>All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...?

Improved efficiency and the demise of bulky storage devices has created a proliferation of small-form-factor designs. We have two proper standards in widespread use (mini-ITX and mini-STX) and an array of proprietary designs from Intel, Zotac and others. It's now possible to get a fast gaming machine the size of a Mac Mini, or a monstrously powerful workstation that'll fit in a shoulder bag.

https://www.zotac.com/us/product/mini_pcs/magnus-en1070k

https://www.sfflab.com/products/dan_a4-sfx


Strawman. Nobody claimed the ATX standard enabled innovation. It enabled the reduction of e-waste as OP indicated. Think of this the next time you trash your innovative phone or laptop because the non-user replaceable battery/ram/ssd/display/whatever failed.


It's worth mentioning that the utility of a pickup truck dramatically exceeds the utility of a Tesla.


Novel form factors are often how laptop manufacturers distinguish themselves from their competitors. There is enough space within a desktop PC case to formalize a standard; as laptops get thinner and thinner, however, many engineering/layout tweaks are needed to fit everything within a thin chassis. Standardizing across different device models would be asking OEMs to stop putting effort into competing with each other. And I say this as someone who has just had a catastrophic motherboard failure on their 8-month-old laptop and had to do a replacement that would've cost me as much as a new laptop outside warranty.


Because size and weight are important distinguishing features for laptops. Customers pay more for smaller, lighter laptops. Using standardized components and chassis would mean a big competitive disadvantage.


At least for phones there was Phonebloks [1], which became part of Google's Project Ara.

Maybe it could have evolved into a laptop experience if blocks got powerful enough and somebody developed a compatible chassis.

Update: Project Ara was cancelled in 2016 [2].

[1] https://phonebloks.com

[2] https://www.theverge.com/2016/9/2/12775922/google-project-ar...


Project Ara is no more...


:( Exactly, I was coming back to update my comment: https://www.theverge.com/2016/9/2/12775922/google-project-ar... Seems they cancelled it.


It did not happen for phones either. Why should this be different?

Maybe laptops are now mature enough as a product that what you suggest could be feasible, but now it is too late for business reasons.


Maybe we could try writing an open letter to companies and promise our support, even if we get less value at first. Chances are slim, but at least we would have done our part.


I have a Lenovo T430u running Kali, and it is rock solid. I love the keyboard, and I use the TrackPoint for CAD work in FreeCAD. I never feel like it is going to slip from my hands when I pull it from my backpack. It is so easy to open up that I open it twice or more a year to clean out the fans, which are usually clean anyway; I like seeing the internals, like a car mechanic who likes to check under the hood ;)

I considered the Lenovo X1 Carbon, but it is pricey, doesn't have a number pad, and has the ultra-slim form factor of an MBP or other similar notebooks.

The Lenovo T580 has the num pad, but the graphics card is the NVIDIA MX150, a mobile but faster version of the GeForce GT 1030. Not really an issue for me, but my son's Lenovo Yoga came with a 1050 two years ago.

Anyway, I've owned all sorts of notebooks, including MBPs, and have found the Lenovos to be my workhorses, getting out of my way so I can get things done. Yes, the battery is only 4 to 6 hours, but for me, even traveling and living all over the world, it has never bitten me work-wise, only when playing.


Why in God’s name are you using Kali as a daily driver? Anyone who does this has no idea what they’re doing. Kali is made exclusively for pentesting, with a modified and insecure kernel specifically for running certain pentesting apps better.


Your username matches your reactive, hyperbolic and concerned response. I never said daily driver. The T430u is only one of my many laptops, and I have been coding and hacking since the 70s. I was running MkLinux on Apple PowerPCs in the mid-90s, and early Linux distros thereafter. I did some binary analysis and disassembly work back then, and still dabble, as well as other activities that make Kali a great system for this laptop and for what I use it for. I also run Windows 10, Ubuntu, Minix, and MenuetOS on other systems, so don't worry.


Can't you tell? He's hacking E Corp.


I still haven't watched the show. Is it good?


By far the best onscreen "hacking" I know of. He does use Linux Mint as his daily driver though...


Definitely going to binge-watch it soon. I have thought of using Redox and writing my own tools. An obscure OS, but still running across internet/ethernet... oh wait, spawn a drone node on a Windows machine using MOSREF/Wasp Lisp in Rust or Shen (shout out to doublec/sdunlop!) [1,2]

[1] https://bluishcoder.co.nz/2015/02/19/spawning-windows-comman... [2] https://github.com/doublec/shen-wasp/releases/tag/v0.8


And what have you found to be the issue(s) with running Redox as your OS?


I am not using Redox. I am looking into it currently for its potential of easily porting standard utilities and other programs.


When does he say that he uses Linux Mint? From what I've seen, his desktop looks like Gnome 2 with a dark theme, or a heavily modified Gnome 3.


I thought he was a KDE fan



I think it's great, but the show changes a lot from how it starts off in the beginning, which understandably puts off a lot of people who were in love with the early bits.


I liked it initially but the later episodes of the first season go off into la-la land which didn't do it for me. Haven't watched the second season.


AFAIK the Kali kernel is Debian unstable's, modified to allow wireless tools like the aircrack-ng suite to do injection and other fun things. So it's just as insecure as any bleeding-edge *nix; the real risk factor depends on what packages you install and what services you run. I wouldn't really recommend it as a daily driver either, but I also wouldn't frown on someone who does.


I've likewise always been happy with Lenovos. I'm currently using an X1 Carbon (4th gen, FHD screen, ~6-7hrs battery) and a P71 (4K screen, built-in Nvidia GPU disabled, ~4-6hrs battery), and their fans run only when I'm stressing the CPU (e.g. compiling); even then I only hear the flow of the air.


I would like to try an X1, but I think the P71 is more my speed. I wish they offered an X1 with the same rubbery feel as the T430u I am using now. I am not a fan of low-friction surfaces, since the nerves in my left hand are shot and my right hand is very insensitive, which has led to me dropping things lately.


Just some small warnings:

the P71 is very bulky (somehow more than it looks in the pictures) and its case feels a bit cheap (the case is "real" plastic, and if you tap on it below the keyboard it really makes the sound of an empty plastic case) => personally I expected a bit more, as it's not cheap (the components I chose are basically the cheapest ones, with the exception of the 4K panel and the backlit keyboard).

Additionally, some weeks ago, while I was typing, the backspace and "t" keys stopped working out of the blue => I switched the laptop off, but 1 day later the keys were still not working => I then opened the laptop and extracted the keyboard (veeery easy - compliments, Lenovo) to read the model number so I could order a spare part later, touched the keyboard's connector cable a bit, and after putting it all back together the keys magically started working again.

Saying all this just because I have the feeling that, overall, the P71's build quality might be a bit lower than that of other models. Cheers.


Thanks a lot. I really prefer the solid feel of my T430u, so I will probably go with trying an X1 for once, or the smaller P51 I believe.


How did you learn FreeCAD? I really want to switch to it from Autodesk Inventor but have spent hours messing around and can't even make a cube. The YouTube videos I was watching assumed I knew too much.


Same here, glad I am not alone. I wasn't sure if I had a bad build or something so I built it from source but even then I couldn't get it to work.


I use Autodesk Fusion 360, which is free to use if you are a startup or business making less than a certain amount of money, and it now includes what were once extras: analysis (FEA), machining, and other features. It really is amazing.

I also use FreeCAD, since I have been familiar with it for years. It was once clunkier and had fewer features, but now it can be used for a lot of the things I need to do.

For a quick start in making a cube, make sure you are in the correct workbench (Part or Part Design) for the tutorial. The main issue I have is that it is difficult to interface with clients that use Autodesk, Solidworks, or Rhino products. I have done FEA with the CalculiX backend in FreeCAD, and used ParaView to create my FEA pictures for reports. If you know Python, you can even create a cube in the console, as sketched below. Check out the scripting tutorials. I am not a fan of Python, but I use it in FreeCAD and Blender3D.
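
For the cube from the console specifically, a minimal sketch looks like this when run from FreeCAD's Python console (the document and object names here are arbitrary):

    # Run inside FreeCAD's Python console (View > Panels > Python console).
    import FreeCAD
    import Part  # ensures the Part module (and the Part::Box type) is loaded

    doc = FreeCAD.newDocument("Demo")
    cube = doc.addObject("Part::Box", "Cube")  # parametric box primitive
    cube.Length = 10.0  # dimensions in mm
    cube.Width = 10.0
    cube.Height = 10.0
    doc.recompute()     # updates the model tree and 3D view

The Box stays parametric, so you can keep tweaking Length/Width/Height from the GUI afterwards.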


You use Kali as a daily driver? Wow.

Do you use a non-root account?


It's not my daily driver. See my detailed response above.


Lenovo/ThinkPad + Kali sounds like a match made in heaven. May I ask what WiFi chipset is on board? 'lspci | grep Network' should show it. The product page just says a/b/g/n, which isn't terribly useful.


It's a 2013 T430u with an Intel Centrino Wireless-N 2230 chip. It only supports 2.4 GHz, not 5 GHz. I use a second, external USB WiFi dongle for Wireshark and other uses. It's six years old and great for more than I thought it would be.


What WiFi dongle do you recommend for good Linux Wireshark support?


If you’re looking for something bigger for aircrack etc., I use an Alfa Network AWUS036NHA, which has a 5dBi antenna and can be picked up off Amazon for about 30 euro.

For normal dongles, a search for “Anadol Gold Line Wifi AWL150 Micro 150Mbit/s USB WLAN S” shows the one I use in various BSDs/Linuxes with no problems.

Both use re0/run0 chips.


It depends on your needs and budget. I picked up a Netgear dongle years ago for $25 that still works fine. Just Google your Linux distro, Netgear, and Wireshark to find any gotchas.

It also depends on whether you are in managed mode or monitor mode. See this: https://github.com/nmap/npcap/releases

Here's some more info. on modes and your capture setup: https://wiki.wireshark.org/CaptureSetup/WLAN/CaptureSetup/WL...
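
And once a dongle is in monitor mode on Linux, a quick sanity check that frames are actually arriving can be done with scapy, roughly like this ("wlan1" is a hypothetical interface name; substitute your dongle's):

    # Minimal 802.11 sniff with scapy; needs root. Assumes the interface
    # is already in monitor mode, e.g.:
    #   ip link set wlan1 down && iw wlan1 set monitor none && ip link set wlan1 up
    from scapy.all import sniff, Dot11

    def show(pkt):
        if pkt.haslayer(Dot11):  # only raw 802.11 frames
            print(pkt.summary())

    sniff(iface="wlan1", prn=show, count=20)

If the summaries show Ethernet rather than 802.11 frames, the card is likely still in managed mode.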

