An app for M1 Macs that plays the sound of a fan as CPU usage goes up (rambo.codes)
843 points by spideymans 14 days ago | 445 comments



Pre-M1, I've always preferred a PC.

When I'm using a laptop, especially for real work, it heats up, the fans go crazy, the laptop is hot to the touch and everything slows down.

It almost makes you anxious. You're wondering if it's about to crash or catch fire.

I've always kept a desktop PC on the go for this reason; a well cooled desktop just doesn't have these issues. With a laptop, I always feel like I'm compromising a bit.

I got an M1 MacBook Pro recently and it just doesn't have these issues. I fire up my whole dev environment and get busy, and it's still quiet and (mostly) cool, and I don't notice anything slowing down.

I don't care so much about the architecture differences, x64 vs ARM etc, but the fact that I can finally use a laptop like a desktop is massive.


Same here, it's like a freaky experience. I run mostly from the browser but have a couple of things going at the same time. One thing I noticed is the blazing-fast typing on the Mac. There is an ever-so-small lag on Windows, barely perceptible while you're using it, but when you type on the Mac you notice the difference, and the brain just feels good using it. I have no idea if this is a real thing, but it's similar with scrolling: you touch the trackpad and things move, independent of the load I'm running. Now that many applications are becoming M1 compatible, I'm thinking of replacing my Windows box with a Mini plus two external hard drives. I can't imagine what will happen when they release a new processor. This one does more than enough.


One of the reasons iOS feels smoother than Android is because the render loop of the OS is decoupled from the app. The apps aren't allowed to introduce jank, so if you're scrolling a webpage and stuff is loading simultaneously, iOS will be way smoother. I think this is also why they can have such low latency on inputs, for example with the Apple Pencil, which has much lower latency than the Surface Pen or Android styluses. I had a 120Hz Android phone for over a year, and while the frame rate when scrolling is slightly worse on iOS, overall the OS feels more fluid to me. On a 120Hz iPad there's no comparison.

I am speculating here as I don't know for sure, but I remember iOS started as a derivative of OS X, so this may be the case for macOS as well. So I think it's not your imagination; it's a different input and render architecture than Windows or Android.


Android has had a separate "render thread" for ages, though it's per app and runs in the app process. Some animations run on it, but many do not. You can't access it directly from within the app.

The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.
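
For what it's worth, here's a rough sketch of how an app opts into that split through GCD quality-of-service classes (the task bodies are placeholders; which core type a given QoS actually lands on is the scheduler's decision, not a guarantee):

    import Foundation

    // UI-facing work: high QoS, which the scheduler strongly prefers to
    // run on the performance cores.
    DispatchQueue.global(qos: .userInteractive).async {
        // e.g. prepare the data the next frame needs
    }

    // Deferrable work: background QoS, eligible for the efficiency cores,
    // so it doesn't compete with the UI for the fast ones.
    DispatchQueue.global(qos: .background).async {
        // e.g. indexing, sync, prefetching
    }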


> The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.

This feature has been in the Linux kernel for ages[1]. Android and ChromeOS are based on the Linux kernel, and have had this feature for quite some time. This is nothing new.

[1] https://community.arm.com/developer/ip-products/processors/b...


So why does Linux UX feel slower?


Because in many cases, it is. Unfortunately, there isn't as much communication between the GUI people and the kernel people in the Linux community as there is between those same groups at Apple Inc. Not to mention, there are multiple competing groups of GUI people in the Linux community making coordination across these levels difficult. Also, there are many competing interests working on the kernel who might oppose kernel-level optimizations which favor desktop usage at the cost of, for example, server usage. As a result of these and many other factors, Linux's desktop UX remains far less optimized when compared to macOS's desktop UX.

As with most of Linux's rough edges, however, this is trivially fixable if you're technical enough. Of course, that's exactly macOS's advantage. If you want the best experience on macOS, you don't need to be technical.

Personally, I run Linux with a tiling window manager and lots of custom keybindings and little scripts. Having used the default macOS experience a few times at work (on machines far more powerful than my dinky laptop), I can assure you that my highly customized setup feels far more responsive. On the flip side, it required a lot of up-front investment to get it to this point.


I never felt a UI responsiveness difference between Linux and macOS, but Windows (including fresh installs on powerful many-core machines) is a different story. The number one reason I ever switched from Windows to Linux is the latter always feeling way more swift: UI responsiveness always remaining perfect, and some background tasks also working much faster. And I never actually used lightweight WMs - only KDE, GNOME and Xfce. The first time I noticed any slowness in Linux was on a Raspberry Pi 4 with the default LXDE.


This.

I think the main advantage with macOS is that it's above all else a system designed to be used interactively, as opposed to a server, so Apple doesn't have to ship a compromise configuration that's merely "good enough for most things".

I also run Linux with a tiling window manager on an old machine (3rd-gen i7), and it flies. One thing that made a huge difference in perceived latency for me was swapping the generic kernel for one with Con Kolivas' patch set [0].

I'm using Arch and sign my own EFI binaries, so I don't mind out-of-tree patches, but for Ubuntu users and similar who don't want to mess with this, there's an official `linux-lowlatency` package which helps [1].

---

[0] https://wiki.archlinux.org/title/Linux-ck

[1] https://packages.ubuntu.com/search?keywords=linux-lowlatency...


X-Windows: …A mistake carried out to perfection. X-Windows: …Dissatisfaction guaranteed. X-Windows: …Don’t get frustrated without it. X-Windows: …Even your dog won’t like it. X-Windows: …Flaky and built to stay that way. X-Windows: …Complex non-solutions to simple non-problems. X-Windows: …Flawed beyond belief. X-Windows: …Form follows malfunction. X-Windows: …Garbage at your fingertips. X-Windows: …Ignorance is our most important resource. X-Windows: …It could be worse, but it’ll take time. X-Windows: …It could happen to you. X-Windows: …Japan’s secret weapon. X-Windows: …Let it get in your way. X-Windows: …Live the nightmare. X-Windows: …More than enough rope. X-Windows: …Never had it, never will. X-Windows: …No hardware is safe. X-Windows: …Power tools for power fools. X-Windows: …Putting new limits on productivity. X-Windows: …Simplicity made complex. X-Windows: …The cutting edge of obsolescence. X-Windows: …The art of incompetence. X-Windows: …The defacto substandard. X-Windows: …The first fully modular software disaster. X-Windows: …The joke that kills. X-Windows: …The problem for your problem. X-Windows: …There’s got to be a better way. X-Windows: …Warn your friends about it. X-Windows: …You’d better sit down. X-Windows: …You’ll envy the dead.

https://donhopkins.medium.com/the-x-windows-disaster-128d398...


> like Sun’s Open Look clock tool, which gobbles up 1.4 megabytes of real memory!

It's funny to read this in an era when smartphones come with 6 GB of RAM to compensate for developers' laziness and unprofessionalism.


Almost all major distributions use Wayland as of today.

Move on, it's 2021...


I use Plasma Desktop and it's been more responsive than macOS was, so I don't know.


On the exact same hardware, Linux always felt much faster to me. I remember doing things like resizing Finder windows vs KDE's Dolphin: Finder would be all janky and laggy, and Dolphin wouldn't miss a frame.



Yeah, my go-to for speeding up Macs is to throw Linux on them.

> the render loop of the OS is decoupled from the app

Can you elaborate on this? If you do too much work on the main thread in iOS, it's going to hang the UI. Isn't the main thread the "render thread"? Do scroll views have some kind of special escape hatch to get off the main thread for continuing to scroll if the main thread is blocked with loading the content?


I believe the point is that the "main thread" for your iOS application, is not the main thread of the OS. They're totally decoupled.


Err, same with Android? And every OS ever. That's just standard process isolation. Or am I misunderstanding something?


There are some deep-dive articles on the way the input loop works. But OP is correct, and it's the reason iOS feels smoother. Android has a lot more UI lag.


I don't think Apple has any special tricks for the input loop.

Some Android phones really do have input lag, but it is not caused by CPU load. For example, on my phone there is approximately 100-150 ms of lag between tapping the screen and registering the touch. The lag is not caused by CPU load, but by a slow touch sensor.

I don't think Apple has any smart code-optimization tricks. Either they have a faster touch sensor, or some people just believe that if something is made by Apple then it must be of better quality.

Here is a comparison of input lag in games on iOS and Android [1], and it shows similar results: 80 ms between a tap and a reaction on screen.

[1] https://blog.gamebench.net/touch-latency-benchmarks-iphone-x...


They do! See https://devstreaming-cdn.apple.com/videos/wwdc/2015/233l9q8h... and the corresponding WWDC session on them minimizing the input-to-display time.


This is off-topic, but I love that the title of the slides is "233_Advanced Touch Input on_iOS_04_Final_Final_VERY_Final_D_DF.key".


I wonder why somebody working at Apple wouldn't just use Git for that?


<laughs in Windows 3.1>


UIView animations do not run on the main thread, and will not be blocked if the main thread is blocked. This does help a bit with keeping the OS feeling smooth, but it is far from the only reason.
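
You can see this for yourself with a small sketch (assuming `box` is a UIView that's already on screen in some view controller):

    import UIKit

    // Commit an animation, then flush the Core Animation transaction so the
    // animation description reaches the render server right away.
    UIView.animate(withDuration: 2.0) {
        box.center.x += 200
    }
    CATransaction.flush()

    // Deliberately block the main thread. The box keeps gliding, because the
    // render server interpolates the frames out of process; only event
    // handling and layout inside the app stall during the sleep.
    Thread.sleep(forTimeInterval: 1.5)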


This is not entirely accurate. iOS indeed uses a client-server UI model, kind of similar to X11. Along with submitting “widget” hierarchy updates, it also supports submitting “animations”. The downside is that the animation states are not truly accessible by the actual app after it submits them.

The scrolling animation is 99.9% of the time implemented as a client-side animation timer submitting non-animated hierarchy updates to the server. It’s common to have janky scrolling.
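
To make the "submitting animations" part concrete, here's a rough sketch of the explicit form (`someLayer` is a made-up name for a CALayer already in a visible hierarchy): the animation object is shipped to the render server, which interpolates the frames on its own, while the layer's model value stays whatever the app last set.

    import QuartzCore

    // someLayer stands in for a CALayer already attached to a visible view.
    let slide = CABasicAnimation(keyPath: "position.x")
    slide.fromValue = 0
    slide.toValue = 200
    slide.duration = 0.3

    // Adding the animation hands its description to the render server, which
    // runs it out of process; the app doesn't see the intermediate frames.
    someLayer.add(slide, forKey: "slide")
    someLayer.position.x = 200 // keep the model value in sync with the animation's end state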


> Along with submitting “widget” hierarchy updates, it also supports submitting “animations”.

Is that how all these third-party iOS apps have completely consistent "pop" animations on long press?


No, that would be OS-provided GUI components (or sometimes manual reimplementation), similar to how most win32 apps had the same right-click menu behavior.


Off-topic: I see the lack of standardized OS-provided GUI components in Unix/Linux distros as the main root cause of low Linux adoption. I'm assuming there isn't such a thing, since I've never been able to notice any consistency in the GUI of any Linux distro and/or any GUI app that runs on Linux :P But I may be totally wrong.

On-topic: they should build an M1 app that simulates loud coil whine at low CPU usage, so you can feel like you're using a Dell XPS.


Well, there are common components, it's just that there are multiple standards (Qt, GTK, wx, etc)...

I'm using a tiling window manager with mostly GTK apps, so pretty much all menus and such look the same. The worst offenders are Firefox and IntelliJ, although they have improved a bit lately.

However, I'm not sure that this is the reason for lack of adoption. Windows has always been a patchwork of interface design, every other app having their own special window decorations and behavior, including MS' own apps (thinking of Office in particular). Also, seemingly similar elements, such as text inputs, behave differently in different places. For example ctrl-backspace sometimes deletes the previous word, sometimes it introduces random characters.


The unique thing about this is that it takes a widget from the app, and expands it out while blurring the rest of the screen. So it’s not just an OS gui component.


Blurring happens inside the UI server process. Here is a related technique in macOS: https://avaidyam.github.io/2018/02/17/CAPluginLayer_CABackdr...

Basically, it's like an iframe — you declare what contents this portion of screen should have, and the render server composes it onto the screen. The render server is shared between all apps (just like in X11), so it can compose different apps with dependent effects (like blur). Apps don't have direct access to the render server's context, just like a webpage doesn't have any way to extract data from a foreign iframe.


I wonder what made Apple give the iPad Pro such an awful screen though, considering all the software optimisations that they do. I got the M1 iPad a month ago, and the screen has absolutely horrendous ghosting issues. Like, what is this? An LCD from 2001? Just open the Settings app and quickly scroll the options up and down - white text on the black background leaves such a visible smudge. It bothers me massively when scrolling through websites or apps in dark mode; it honestly doesn't feel like an Apple product to me. I honestly haven't seen this issue on any screen in the past 10 years, and here this brand new (and very expensive) iPad with a 120Hz screen has something that looks like a 50-100ms gray-to-gray response time.


This is a very interesting anecdote, considering the M1 iPad Pro is supposed to have one of the best screens available on any device of that form factor, the same XDR display technology as their $5000+ monsters. Every reviewer has mentioned the display as the selling point. I have looked at them in person and have been debating buying one, but next time I'm at an Apple Store I'll want to see if I can replicate what you're seeing.

You might be experiencing the mini-LED effect where the backlighting is regionalized, which isn't ghosting but can be noticeable.


It's on the 11" model, so definitely not a mini-LED issue.


Edit: the parent commenter does not have a miniLED iPad.

If you have the 12.9” M1 iPad, the ghosting is likely due to the new miniLED backlight which was introduced with that model.

This backlight is not static, but tracks to content in order to increase effective contrast (similar to FALD backlights on higher end LCD TVs). If the backlight does not track the LCD content fast enough, there can be ghosting.

In addition, since the LEDs are large compared to pixels, you can sometimes see haloing, particularly in small bits of white on a black background.

Overall, while the display is great for video, especially HDR video, it has some problems with UI and (white on black) text.


It's the 11" model, so the regular old LCD, not the new fancy mini-LED.


No issues on my iPad; you might want to take it to Apple.


I’m guessing it must be an issue with those new miniLED screens, or some other significant generational problem. I have two older 12.9” iPad Pros, one 1st generation and one 2nd generation (the first with the 120hz display). They are both excellent displays with no such issues with motion or ghosting.


I’m worried the upcoming upgraded MBP will only have this option. Although I read they released an update that should minimize this issue. Have you tried that?


To this note, I've noticed huge discrepancies in display quality in iPads. They were passed out in my last year of high school, and each one even appeared to have slightly different color grading. The whole thing was pretty funny to me, especially since I have no idea how artists would be able to trust a display like that.


The display on the mini-LED iPad Pro is Apple's first more or less in-house display. The panel itself is LG's, I believe.


Recently I visited the Apple store and compared the latest 11 inch iPad Air (A14 processor) and iPad Pro (M1 processor) models side-by-side. I loaded up Apple’s website in Safari and scrolled. The Pro is noticeably butter-smooth, while the Air stutters. My iPhone also stutters in the same way, but it’s never bothered me before. It’s only after looking at the performance of the M1-driven iPad Pro that I knew to look for it. And I previously had a similar experience noticing how much smoother my iPhone was than my old Android phones!

I don’t know for sure the processor is the difference, this is just a report of my observations.


That's not because of the processor. It's because the newest iPad Pro has a 120Hz display and the other devices that you were comparing have 60Hz displays.


While 60Hz screen refreshes are less smooth than 120Hz, the small (but noticeable) difference due to refresh rate wouldn't correctly be described as "stutter".

The M1 processor makes a real difference. I have both the M1 and the most recent non-M1 iPad Pros.


Yeah you really don’t notice 60Hz scrolling until you try 120Hz scrolling. It’s a bit like you didn’t notice Retina displays until you tried one then looked at a pre-retina display. It’s crazy how you adapt to perceive things once you see something better/different.


Maybe I should hold off on the iPad Pro until I can get 120Hz on all my devices.


This is not really true, at least in any sense that really differs from other OSes. And, if you watch closely, you will notice that poorly-behaving apps running in the background can and will introduce jank in the foreground process. Since iOS lacks good performance monitoring, this (along with battery consumption) has historically been the easiest way to figure out if an app is spinning when it shouldn't be.


I'm sorry, I have a Samsung Galaxy S20 Ultra with a 120Hz display. Using my girlfriend's iPhone 12 makes me dizzy.

120Hz is something you don't notice much when you enable it, but you definitely notice when it's gone.


Typing lag is such a sad result of all our modern computing abstractions.

https://www.extremetech.com/computing/261148-modern-computer...


Hm. I've always thought it was more of a result of our current display technology? Digital displays buffer an entire frame before they display it. Sometimes several frames. And the refresh rate is usually 60 Hz so each buffered frame adds a delay of 16 ms. CRTs on the other hand have basically zero latency because the signal coming in directly controls the intensity of the beam as it draws the picture.

Anyway, is it any better on displays that have a higher refresh rate? I feel like it should make a substantial difference.


CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

All CRT displays attached to computers in the last 40 years were driven from memory buffers just like LCDs, and those buffers were typically only allowed to change while the electron beam was "off", i.e. moving from the bottom of the screen back to the top. Letting the buffer change while the beam is writing results in "tearing" the image, which was usually considered a bad thing.


> CRTs are potentially worse.

Video game aficionados would like to have a word with you:

https://www.wired.com/story/crt-tube-tv-hot-gaming-tech-retr...

To be fair, much of this is the color and shape rendering, where pixel art had been tailored for CRTs.

Twitchy gamers do swear by “zero input lag” but are perhaps just nostalgic; the difference is likely to be 8ms vs. 10ms:

“Using the industry-standard definition of input lag, 60Hz CRTs don't have 0ms input lag. 60Hz CRTs have 8.3ms of input lag…”

https://www.resetera.com/threads/crts-have-8-3ms-of-input-la...


As you said, that article seemed to be more about the appearance of objects on a CRT than lag, and I kind of agree with the nostalgia crowd in that respect. But [raster] CRT lag is always going to be 16ms (worst case) and will never be better, while LCDs can in theory run much faster as technology improves.

If we shift the discussion to vector CRTs (which have no pixels) such as the one the old Tempest [0] game used, the CRT has a major advantage over an LCD and the lag can in principle be whatever the application programmer wants it to be. I miss vector games and there's really no way to duplicate their "feel" with LCDs.

[0] https://en.m.wikipedia.org/wiki/Tempest_(video_game)


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen.

Back when I had CRTs, 60Hz displays were the older, less-common, cheapo option. I'm having a hard time remembering a CRT display that wasn't at least 75Hz (I believe this was the VESA standard for the minimum to be flicker-free), but most of the monitors I used had refresh rates in the 80-90Hz range. I remember a beautiful higher-end CRT that had a refresh rate around 110Hz.

85Hz gives you a frame time of 11ms, which doesn't sound much better, but is a 30% improvement over 16ms.


Before multi-sync CRTs and SVGA, 60Hz was not the "cheapo" option.


I don't think you can get a display slower than a TV, and they do in fact update at ~60Hz (or 50Hz, depending on region). Of course you're probably only getting VGA, 240p, or less in terms of pixels.


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

This is exactly the same as LCDs though, no? LCDs are also drawing an entire frame at a time, they're not "random access" for lack of a better term. There's just typically no image processing going on with a CRT* though, so there's no inherent latency beyond the speed of the electron gun and the speed of light.

*I'm aware there were some later "digital" CRT models that did analog->digital conversion, followed by some digital signal processing on the image, then digital->analog conversion to drive the gun.


I don't think LCDs buffer anything. I've experienced screen tearing, which shouldn't happen with buffering. Most applications implement some kind of vsync, which does introduce buffering and the related delays.

The best option is to use adaptive sync and get rid of vsync. But support for this technology is surprisingly immature; it mostly works in games.


Screen tearing happens when the software swaps buffers on the GPU while the GPU is in the middle of reading out a buffer to send to the monitor. That tearing has nothing at all to do with whatever buffering is or isn't happening in the display itself, because the discontinuity is actually present in the data stream sent to the display.
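
On the application side, the usual way to avoid swapping mid-scanout is to pace drawing to the display's refresh; on iOS that's a CADisplayLink (a rough sketch, with the actual per-frame work left as a placeholder):

    import UIKit

    final class FramePacer: NSObject {
        private var link: CADisplayLink?

        func start() {
            // The callback fires once per display refresh, so updates committed
            // inside it line up with the display's scanout instead of landing
            // partway through a frame.
            link = CADisplayLink(target: self, selector: #selector(step(_:)))
            link?.add(to: .main, forMode: .common)
        }

        @objc private func step(_ link: CADisplayLink) {
            // update state and draw the next frame here
        }

        func stop() {
            link?.invalidate()
            link = nil
        }
    }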


See also this talk by John Carmack about buffering problems in input and output stacks.

https://www.youtube.com/watch?v=lHLpKzUxjGk


Sounds broken? Return it.


You have what I call latency sensitivity, or jank sensitivity. The most important thing in these environments is actually not how fast they are (or how low the latency is), but the latency being consistent. With Windows and Android there is micro-jank everywhere. I know because, while most people don't notice, I feel a small, pin-push-like pain in the right back of my head every time it happens.


Depending on what you're coming from, it could just be that you haven't been using modern hardware. I agree some hardware is faster than other hardware. I notice a difference going from my Ryzen 5900X to my i9-11900K; typing and everything is definitely smoother on the latter machine. Part of it is just software advantages. Apple designed the compiler and built the system for the M1. On Windows and Linux, Intel has a boatload of engineers working on its behalf.

The single core performance of my 11900K is really hard to beat at 5.3GHz, and with 7 of 8 cores hitting that frequency simultaneously (which it hits all the time in desktop use) it's just buttery smooth. My guess is that you just came from an older system. The advances in CPUs (along with good compiler programmers working on that architecture's behalf) in my experience are larger than reviews and colloquial recommendations would suggest.


Just FYI from a person who's used Macs since December 1984 and Windows since DOS, the instant UI responsiveness of Macs (when using any input device), regardless of load, was ALWAYS a feature, even going back to the very first Mac. I always noticed the lag on Windows and couldn't believe it didn't bother anyone... which of course it didn't, because it's all most people knew!

As a hardware nerd I always kind of like that moment when the CPU starts really working and you hear fans spin way up. The auditory feedback is like a throwback to a more mechanical age, like a manual transmission vehicle where you can judge when to shift by engine speed.


When I replaced all my spinning disks with SSDs and fans with ultra quiet ones I realized just how much I relied on fan noise to signal things. E.g. if my computer isn’t visibly on, the fan/disk noise tells me whether I need to power it on or just wake it. I’ve had to start changing my habits as a result.


And it dates back to 1988 for me. I'd always rely on the nice scratching sound of the HDD and the LED to know whether my computer had crashed or was just loading. There was no need for a spinning wheel when you could hear the data getting loaded ;)


I could tell what boot phase my 386 + 486 desktops were at by listening to the ticking of the HDD heads. When I installed Win95 on the 486 (the poor thing), the time until the o/s was finished with the drive quadrupled.


At the same time, we are losing the disk access LED.


... and then you get an electric car, and the only time you have any auditory feedback is when it's very hot out and you supercharge.


I like it if the laptop only starts the fan up when the device is actually working hard, not constantly turning it on and off during light web browsing.

My 2018 MBP 13” and the Surface Laptop I’ve tried did that well. An XPS 15 I’m currently using does not.


Web browsing isn't a light activity anymore. Web pages consume more memory and more CPU than my text editor.


Has there been a time in history where web browsing was lighter than text editing, ever?


No, but there is now, "thanks" to Electron :P

More specifically, HN would use less RAM than VSC would to open a .txt file.

Eh, not bad, not good; we don't know what the new goalpost positions mean yet.


That’s why I specified “light web browsing”.


I don't understand why they sell the Pro. The touchbar is rubbish and the Air has 99% the same performance. Yeah, the Pro can go faster for longer without throttling, but it's heavier and doesn't have physical F-keys.

It's more expensive too… not sure why anyone would buy it over the Air. Two more usb-c but you can only hook in one external monitor! Also 200 more nits on the screen, but… is it really that important?

I can't wait for the 14/16 with the M1X. These are already barn burners.


Obviously, to people who choose to pay for it. I produce video news and the Touch Bar is integral to my workflow. Just because something is not useful to you doesn’t mean it’s useless.


The touch bar should have been in addition to the function row, really. On the 16" Mac there is plenty of space for both. Few people hate the Touch Bar itself; they hate the lack of function keys.


I hate it because it’s too easy to accidentally press a “key” and trigger some potentially large scale event.


This has always been my issue with it. For some reason it doesn't have pressure sensitivity like the trackpad (which also has no physical button). Lightly brushing the touchbar is enough to activate the controls on it.


I hate it because it takes three taps and a delay to change the volume or brightness, 99% of my use case.

Yep, we share the same unpopular opinion. I really like the Touch Bar.


I do like the idea - and I saw a Windows laptop with a bigger "touch bar" that I think is even more practical - but I just wish they offered a version without it. I heard the value of just the touch bar is in the region of $600 (random guess), because it adds some hardware from the Apple Watch to the Mac. That's a lot of money for a feature like that.


I'm curious, how is the touch bar used in your workflow?


Final Cut Pro most likely.


Correct.

The Touch Bar is also well integrated into QuickTime Player and Quick Look, making video navigation much more efficient.

I only wish VLC would use it for timeline navigation.


And the rumors say that Apple will be removing it. Let's wait and see for the upcoming rumored 14/16 Macbooks.


> Two more usb-c but you can only hook in one external monitor!

Actually you still only get two USB-C ports on the M1 MacBook Pro. The four-port model is still Intel.

That lower-spec Pro has never made much sense. The last time an entry-level MacBook Pro made sense was 2015, when you got a noticeably higher-TDP chip (28W vs 15W in the Air, iirc - could be mistaken), more IO (two Thunderbolt 2 ports vs the Air's one, and a full-size HDMI port), and of course an IPS retina screen.

I think the low-end Pro came to existence because they intended it to replace the Air, but the tiny 12" MacBook had already claimed the unadorned "MacBook" name, which would have made a lot more sense.

But then they ended up refreshing the Air as people just never stopped buying the old one, despite its ageing internals and garbage 2010-era screen (possibly because it was the only computer Apple were selling at the time with a keyboard that worked).

I have no idea why it still exists though. It's filling a niche that isn't there. I think the intent is for it to be basically an "upmarket" alternative to the Air, but the only thing they have to offer right now is a Touch Bar. Which for most people, including myself, is more of a reason not to buy it.


I used MacBook Airs for 10 years. I had a week with a loaner machine which happened to be an MBP with a Touch Bar and I found it neat, and so I ended up getting an M1 with a Touch Bar. I like it. I so rarely use the F keys that it's kind of nice to have sliders, emojis, a mute button, et cetera up there. I almost certainly could live without it, but I used to hate on it and now I've actually opted for it.


I kinda like it too, but I resent it a lot because they displaced the function keys. Had they kept both, I think we'd be seeing more Touch Bar fans.

A lot of the time it’s really nice to have a context sensitive/modal form of input.


It actually has the same number of USB-C ports; the M1 MacBook Pros only have two. But, from the perspective of someone who bought one, I got it for the sustained performance provided by the active cooling and for the extra battery life. I would love physical keys instead of the touch bar, but I've used a touch bar for the past 5 years so it's whatever.


Having enjoyed using an Elgato Stream Deck on a Windows box for a while, my next MacBook Pro upgrade cycle definitely includes a touchbar on the wish list.


> The touchbar is rubbish and the Air has 99% the same performance.

It only has 99% of the performance under specific loads. The Air is ~12W TDP, the MBP is ~20W TDP. That is quite a bit of difference (~70%) when you absolutely push it to the limit on both CPU and GPU.

That is why half of the "it doesn't throttle" comments in this thread don't make much sense, as they only consider CPU usage patterns.


The point is that there are very few real world workloads that get it to throttle, and even more rarely will it throttle enough to matter. 99% of use cases are not affected.


The MacBook Pro, by Apple's definition of "pro", aims at creative professionals, who tend to use the GPU a lot more. Which means they are much more likely to be throttled by the TDP limitation of a MacBook Air.

There are also a lot of other use cases where the GPU or NPU is important.


This.

As I wrote in another comment, I produce video news.

My M1 MacBook Pro is a beast of a machine, but I work it to the point of full-blast fans every day.


> you can only hook in one external monitor

You can run multiple monitors on an M1 with a DisplayPort hub. (Not relevant to the point about the number of USB-C ports, of course, since you're plugging into the hub anyway.)


Yep, the touchbar is pretty horrid. It's my personal decision point for the question "Linux or Mac?" If Apple fails to provide a pro model without the touchbar, my next laptop will be Linux.


The differences are a larger battery, active cooling, the touch bar, a brighter screen, better speakers, and the base RAM/storage model has the 8-core GPU.


The Touch Bar makes me so angry. Why? Because it's a less reliable laptop with a worse keyboard that exists for no other reason than to raise the ASP (Average Selling Price) of Macs. That's it.

Go back 5+ years and you had the (2010+) MacBook Air, which was fantastic. A good compromise of size and power at a really low price. Like, competitors just couldn't compete. The only problem? By 2015 or so the display was pretty outdated. It needed a thinner bezel and a retina display and it would've been perfect.

But no, Jony Ive came along with his war of thinness, foisted the butterfly keyboard on us, and added the Touch Bar to increase the price. There's no other reason for it to exist.


My old Clevo laptop has an i7 in it with fans that sound like hair dryers. It never throttles.

I would never buy a laptop where throttling was a presumed solution to thermal issues, because I want to get work done. Pre-M1 MacBooks were awful. Even worse than top-of-the-line Dell laptops.

I have always felt like a minority for expecting a $2k laptop to actually be able to get work done.

If I were ever able to run Linux as a first-class citizen on an M1 (not going to happen) I would buy one in a flash. I have been playing with my mother-in-law's MacBook Air and the thing is pretty darn amazing.


If you're looking for a PC that's not a headache to put together, is semi-mobile, powerful, and quiet, have a look at AMD-based gaming PCs.

I got a Lenovo Legion with a 4800H (8/16), 32GB, 2TB, a GTX 1650 and a 144Hz screen a couple of months ago. It's a really well cooled system. It almost never spins up the fans, it's powerful, has a lot of space, no RGB, and has a super boring, tasteless, anti-theft design that never goes out of style (but the last one is more of a personal preference ;) It's been great.


One eternal problem with gaming laptops and Linux remains the dual-GPU architecture, which is always kind-of-but-not-quite supported. Most often it is easier to forgo either the discrete GPU or the CPU's integrated video, leaving the overall experience frustrating.

The fact that most (all?) gaming laptops come with Nvidia GPUs (requiring closed drivers, unless Nouveau has gotten much better since the last time I tried it) doesn't help.


While I understand that this sucks, I've been running Windows as a primary dev machine for a long time now, so this particular problem doesn't affect me personally.

> It never throttles.

Just in case, are you actually monitoring temperatures?

Regardless, buy some proper thermal paste and pads and replace them on everything; it will quiet down significantly while also running cooler.

The difference between factory Thermal Interface Material and custom is often ridiculous.


I do sometimes when I do heavy things on it (long and heavy compiles or when I'm rebuilding some heavy datasets). I never tried with any synthetic benchmarks. The Mac throttles whenever I look at it the wrong way.

Edit: and I don't mind the noise. I will maybe do something if it starts throttling, but I have had 0 issues for as long as I can remember.


FYI, the one M1 MacBook which throttles under heavy load is the MacBook Air, which is actually a $1k laptop ($999).

The MacBook Pro has a fan and basically never throttles, as I understand from others.


Absolutely untrue. My M1 Air does not skip a beat at all even under sustained load. Barely gets warm.


It’s not untrue. It has been demonstrated in dozens of reviews that the M1 Air throttles under heavy load. Just because it hasn’t done so for your workload, doesn’t mean it isn’t happening for others.

Besides, when it throttles, it’s not a huge slowdown, so most people wouldn’t notice it “skipping a beat” anyway.


An MS Teams video call with 4+ people will get my M1 Air very toasty (because Teams is shit).

However, it's not as hot as my old 16" MBP got (which would get too hot to have on your lap), it doesn't noticeably get slower, and it's of course totally silent.


I know. I have an M1 and it hardly ever throttles. I was making a point to the OP, though.


> a well cooled desktop just doesn't have these issues

Kind of depends on the steady state and your target turbo frequency. I have my fans basically idle when temperatures permit it, but aggressively ramp them to full speed to keep the CPU as cool as possible so as to allow the frequency to scale up beyond spec. This results in a similar effect to a laptop fan -- under heavy load, the machine is loud. I don't really have a problem with this: at idle my computer is using ~100W; at full load, it's nearly 1000W. That's a lot of heat to get rid of, and it has to go somewhere. Having a large water cooling loop can help with the noise, but since I really only use the CPU that heavily for bursts measured in minutes at worst, I think it's fine.

It doesn't make me anxious but I do kind of like the feedback that things are under load. (I have the ramp-up time set pretty long, so something like "go build" that only uses all 32 CPUs for a second doesn't really trigger anything. There's enough inertia to prevent that from raising temps much.)


Must be nice, using a fresh machine that's so silent. I recently switched from a powerful desktop PC with noisy fans to a passively cooled RPi4 running Ubuntu MATE. I just feel my productivity and focus going up now that I don't hear the equivalent of a small fighter jet taking off. The RPi4 is still a fighter though, very quick and powerful for its size, price and specs.


I like my 15” MBP, but the fans are extremely annoying. I had to switch IntelliJ to power save mode so I can hear what I’m thinking.

It's quite surreal that the big Xeon server (an actual server, with optional rack mounts) with lots of spinning rust under the desk is still much less distracting.


You might want to look into Turbo Boost Switch (http://tbswitcher.rugarciap.com), which in my experience does a pretty good job of keeping the CPU cool enough that the fans stay pretty quiet, with a minor impact on performance. (It's easy to turn it on and off from the menu bar, in case you need to turn the knob up to 11 and are willing to hear the fans again.)


This worked well for me also.


Thanks for the tip, folks. I passed it to my IT department for checking (PII, sensitive client data, etc) and, hopefully, approval.


Is this a "hardware has gotten fast enough" situation?

My latest laptop (Ryzen 7) takes a lot of beating before spinning the fans up to an audible speed. I've never seen this happen in laptops before either. (But well, my last one didn't slow down either, it just started making some noise.)


My Core i9 laptop has fans on just about as soon as it boots.


I just looked up some benchmarks. Apparently an M1 is 2-4x faster at many tasks than my wife's gaming desktop from 5 years ago. It's comparable in many benchmarks to my Ryzen CPU as well?

What’s the catch?


The only real catch is that you're currently limited to only 16GB of RAM.


I switched from a 16" Pro with 32GB of RAM to a 16GB M1 Air - I really didn't notice the difference, even though I frequently run some memory hogs (Electron apps, Chrome, multiple Docker containers)


And Big Sur and up... unfortunately the quality of the OS is not keeping up with the quality of the hardware.


How so?


Let's not go there :} Life is too short to compile bug lists for macOS.


I feel like the critical bug list being short is basically THE reason power users would go for Apple laptops.


The best OS in existence.


Surely you're joking here.


No.


I haven't found that to be a catch actually.

I went from a mid-2015 15" MBP to 13" M1 with 8GB. I made the impulsive purchase to get 8GB ASAP rather than wait for 16GB (which was backordered). My primary apps are DaVinci Resolve, Lightroom and Photoshop. On the 15", the fans would run on full constantly. I would have to create proxy video files to work with.

On the M1, I don't create any proxies at all and the video files I work with just play in real-time.

Main catch for me is the two USB ports and one external screen limitation. I have a hub (card readers, USB, HDMI, etc) and a 10-port USB hub coming off that (with 7 external drives connected). As soon as there is a 15/16" available that handles two external displays, I will get that as well.

I dislike the TouchBar and the lone external display, but everything else about the M1 has been excellent for me.


I had assumed that Photoshop and the like would be GPU-bound, not CPU-bound.


Most likely her gaming PC's GPU is on par with or exceeds the M1's GPU. That would be the biggest downside in your specific comparison.


You can only use one external monitor


And that's only on the laptops. Two monitors are supported on the Mac Mini.


That is nevertheless only two screens.


> only two screens

Honestly I feel like multi-monitor setups are actually worse than a single monitor. Having to bend your neck to use some of your windows left me with neck pain. Now I only use one large monitor and use the workspace-switching shortcuts to move everything around so I only look forward.

My productivity is identical and my neck feels better. I'd only say it's worth it when you really do need to see a huge amount at once, like watching the feeds of 30 security cameras. Not for general work/programming.

Seems like multi-monitor setups have become more of a looks thing, because they make people feel like they are sitting in the FBI command center rather than at a generic office worker's desk.


Multi-monitor setup has been a giant boost to my programming productivity.

I can simply keep the specification/documentation/Stack Overflow alongside the editor/IDE window alongside the debugger. If I had to put the benefit in words: I can offload the state of my programming session onto the screen instead of keeping it in my head, freeing up space where more of the abstract logic and relational graph can reside before it gets turned into code.

On the other hand, it gets way easier to get distracted because you can keep a chat window on one side, constantly getting pinged.


I used to do the same thing, but now I put my IDE on the middle workspace with the left and right workspaces holding my IMs/music/browser and terminal. Now, instead of moving my head to see other windows, I press ctrl+alt left/right and the windows move into the space I am already looking at. Effective use of workspaces feels just as productive as multiple monitors did, but my neck is always facing forward.


That's a problem with two monitors, specifically if you try to place them evenly. Neither monitor is centered.

Designate one of them as the "main" monitor, and put anything you need to work with for long periods of time on that. Add an office chair, so you can swivel to look straight at the other(s) instead of twisting your neck. And you'll find there's no such issue.


I just wish I could find a good series of monitors with matching vertical size and dot pitch but different aspect ratios. My best use case would be a 27" 16:9 primary monitor directly in front of me with a 4:3 secondary on the right or left. It would be fantastic to put a 1920x1440 next to my 2560x1440 center display with them being the same physical size.


I only really use two monitors in Unreal Engine. The UI is just too cramped for a single monitor; you really need both.

Outside of Unreal, I usually find workspace switching to be a better workflow than alt-tabbing my way to a window on a second screen.


The HDMI port doesn't work on many M1 Mac minis. Is this counting the two USB-C ports, or HDMI plus one?

That's not too bad if you're using an ultrawide.


You can use a DisplayPort hub to run multiple monitors.


The catch is that you are using untested hardware. You have no idea if this will last or who will support what, and you're potentially risking money and your data. The thing is, the upside for us as consumers is insane. Tbh, a MacBook is 20-30% more expensive than equivalent specs on Windows, but in the case of the Air that's 200-300 dollars, just to try and test this new hardware as your main device. It's a lot of risk.

I feel confident running the system for a while now, and I'm seeing more software come on board. Even for the extra money, I say go. I'm a dev/product designer; most of my work is coding, product development, product management and graphic design. It does everything I need faster, with no lag.


Same.

The problem is now my work desktop is a 32-core Threadripper w/ huge GPU and way too much ram.

It’s kinda nice knowing that if my computer ever slows down it’s because of bad programming and not having a too slow computer.

I want to like laptops. I really really do. But they’re slower, have bad keyboards, and have tiny screens. Or I’m docked and the fact that it’s a laptop isn’t meaningful.

That said, I can’t fricken wait to see Apple build a beast of a desktop with their silicon. I just really really hope the GPU doesn’t suck. Lack of good GPU on Mac is a huge problem for certain lines of work.


> Or I’m docked and the fact that it’s a laptop isn’t meaningful.

You might be underestimating the value in that. I’ve been working off of MacBooks since 2011, and for almost all of that time I’ve had them docked and hooked up to multiple monitors, and working just like a desktop.

But if I ever need to go anywhere, meet a client, travel, go to another room/office, I just unplug and go. If power goes, I don’t lose anything, the laptop is still running. Whatever happens, it’s all there with me, always. It’s not as comfortable when it’s on laptop mode but it’s better to have a less comfortable experience than not having it available at all.

It’s a portable desktop. And I love it. But yeah if you’re looking for workstation-level specs, then yeah, a laptop is never going to be enough.


My primary home computer for several years was a docked laptop. Overall I was disappointed and went back to a full desktop.

> If power goes, I don’t lose anything, the laptop is still running.

What kind of monster doesn’t use a UPS!? =P

I mostly work in games and VR. Maybe someday there will be a laptop that doesn’t suck for game development. Sadly that day has not yet come.


I have a desktop dev machine that I ssh into via ZeroTier and it has been a fantastic dev experience. As a result my entire world needs to be in the terminal, which, for me, was pretty easy (tmux+neovim). `tmux a` and I'm right back where I left off, and it doesn't matter what front-end computer I use. I can now use my iPad as my front end, which makes it a great in-the-bed machine.

The only catch is I need to make sure my dev machine is online.

Now, with Neovim 0.5's support for LSP and Tree-sitter, Neovim is on par with Visual Studio Code.


>It’s kinda nice knowing that if my computer ever slows down it’s because of bad programming and not having a too slow computer.

And then someone with 4 cores tries to run the code and it's unusable, because it only runs well with 32 cores.


Yes. Software developers need to make sure their software runs well on their user’s machines. This is equally true for people developing software on brand new laptops and it running poorly on older laptops.

A great pet peeve of mine is that designers are notorious for only testing their designs on high-resolution MacBooks. A lot of tools look like crap on 1080p Windows displays.

The benefit of 32 cores mostly comes with compile times and build processes, where, depending on your project, it can make a HUGE difference. One of my C++ projects went from 15 minutes to under 3.


I've been on a ThinkPad X1 Extreme for almost 2 years (gen 2 with the i9) and never had any noise or throttling issues. I sit next to my partner (also a dev) and their i9 MacBook Pro regularly spins up and then throttles the CPU a ton. We've seen it regularly throttle down by 70-82% (measured with pmset -g thermlog).

Just saying, I definitely get the feeling Apple values form over function. His biggest issues happen when he's using Docker (the annoying Docker for Desktop version), but I also use Docker all day and don't have any issues...


Well, it depends of course on your Windows laptop. I had a ThinkPad X1 Carbon with no heat or noise issues at all. Now a Surface Pro 7, also with no heat issues. It's not the fastest around, but for a tablet really impressive (typical dev setup running Postgres in Docker, GoLand and WebStorm with no problems).

Still, I'll likely buy an M1 Air soon. Not because of the M1 CPU, but because it's almost half the price of a comparable ThinkPad. And I hate the bloatware shipped with basically every non-Microsoft Windows laptop.


"the fans go crazy, the laptop is hot to the touch and everything slows down."

So you installed Gentoo, what's your problem? 8)


A bit of a tangent, but I am quite impressed by how effective Apple's marketing campaign was: they've managed to redefine what PC (personal computer) means.


By being actual PC compatible hardware for about a decade.


What is your definition of PC? There are about 20 definitions in regular usage.


PC originally meant "IBM PC desktop clone". Apple was different because its hardware wasn't compatible with PC software (for mainstream users).


"PC" was "personal computer" years before the IBM PC. Just look at old computer magazines.

The whole reason the phrase "IBM PC" existed was to differentiate it from the other PCs that already existed. "IBM" was the adjective. "PC" was the noun.

Because of its success in offices, "IBM PC" became just "PC" the same way other words like "omnibus" became just "bus" because it's simpler to say.


This is the correct answer - and "PC", or "personal computer", was originally more of a distinction from old-school mainframe computers that lived in a lab or wherever and took up entire rooms.


As I recall, the most widely used term early on was "microcomputer", to distinguish the small home computers from the larger "minicomputers" (e.g. the DEC PDP-11 of blessed memory). The term "personal computer" was also in use, but the use of "PC" was not common until after the introduction of the IBM model 5150 (whose actual product name was the "IBM Personal Computer").

Since then, you might refer to any of a variety of machines as "personal computers", but "PCs" only meant "IBM PCs" (or later "IBM PC-compatible machines"). In other words, I would argue that the term "PC" derives specifically from the "IBM Personal Computer", and not generically from "personal computer".

Source: I was there. :-) I haven't done the research, but I bet if you did search through Byte and similar magazines of the time, you'd find plenty of supporting evidence. (I do have a memo cube from the early '80s with the slogan "Apple II -- The Personal Computer", but I suspect that was Apple Marketing trying to fight an ultimately losing battle.)



Sure; I don't disagree (and perhaps my previous post was less clear about this) that "personal computer" was definitely in common use, particularly in advertising aimed at "regular consumers" rather than hobbyists, and well before the IBM PC.

I do disagree with the suggestion that the initialism "PC" was understood to mean "personal computer" in general before the release of the IBM PC. If that's overly pedantic, well, I'm a computer nerd; what can I say...


Oh, OK, I think you’re right there.

I was there too, if you need anecdotal evidence: my dad worked in marketing for Apple - notably as the ad manager for the 1984 ad campaign - and you're wrong. I mean, that ad was an attack on IBM PCs. The answer is on Wikipedia too: https://en.wikipedia.org/wiki/Personal_computer

Wasn't "home computer" the generally used term before the IBM 5150?


I don't think so. Comparing use of these terms in books:

https://books.archivelab.org/dateviz/?q=home+computer

https://books.archivelab.org/dateviz/?q=personal+computer

Personal computer was used a lot more when books were eventually written about the era.


As far as I'm aware, in the UK the phrase "microcomputer" (or simply "micro") was used before IBM-compatible machines.


> Wasn't "home computer" the generally used term before the IBM 5150?

"Home computer" was things like the VIC-20 and the TI-99 4/A. "Personal Computer" was machines that you had in your home or office that you didn't have to share, or timeshare with someone else. Think Cromemco, Kaypro, PET, and SuperBrain.


The only reason I mentioned it is because OP's usage caught me off guard. Most of the time in colloquial usage I've seen PC used as an acronym to mean "computer with Windows installed as the OS", especially when being compared to Apple's products.

But in this case it was used as "desktop computer", which sounded strange to me as I consider laptops, however portable, to be personal computers.


I agree with all of that -- and regret the recent trend of using "PC" to mean desktop as opposed to laptop -- but what does any of that have to do with Apple's marketing?

You don't think Apple's marketing is behind the aforementioned recent trend; do you?


My understanding was that Apple's marketing created the initial deviation from the original meaning, which led to the current one, but yes, seeing what other people are commenting now, it's not that simple and I was wrong in my assumption.


> PC originally meant "IBM PC desktop clone".

You're misremembering history. It eventually evolved into that, yes.

Apple didn't really buy into that until the "I'm a Mac/I'm a PC" ads.


And even then they still used the PowerPC chips and branding.


Yep! Never thought about it that way, but indeed, PowerPC was a smaller not-server chip intended for personal computers.


The hardware is beautiful! As soon as it becomes trivial to delete macOS and install Linux (with reasonable support), I'm buying an M1 MacBook Pro. Unfortunately, I'm in too deep with the Linux ecosystem to switch over before then. Fortunately, Asahi Linux appears to be making fast progress, and there are even (spurious) rumors of official support from Apple.


My experience is just the opposite: when I run even two projects in IntelliJ on an 8GB M1 Mac, the whole system just starts lagging to the point that I can't even scroll the code.

This has not been a good experience for me.


That's probably because 8GB is not enough to run IntelliJ comfortably, and the system starts swapping things in and out of memory constantly. Another thing that happens when IntelliJ uses up its heap is garbage collection, including the frequent stop-the-world variety. The solution is to configure it to have more heap. Either way, that's a memory problem and not a CPU problem.

8GB for a development laptop is not a great idea in 2021. I've been using 16GB since 2013. I'll get more with my next laptop, which may or may not end up being the 16" M2?


There was a bug with Android Studio (and I think IntelliJ as well) on M1, not sure if there still is.

You can fix it by setting 'Prefer Tabs' to never - https://www.reddit.com/r/androiddev/comments/jtbl4m/has_anyo...


The uncomfortable heat/fan issue is what ultimately pushed me over the edge to become familiar with Kubernetes. Now I just offload the compute heavy tasks to a desktop in a different room.


I think this is an Intel "feature". They can't upsell you faster CPUs without trading off heat and noise. Glad to see Apple is following a much more sensible approach.


The only thing I've been able to do to get an M1 to spin up the fan noticeably and get warm is to max out both the CPU and the GPU by playing Minecraft Java in a huge world or running other 3D-heavy things.

The same things on an Intel laptop will turn it into what my friend comically calls a "weenie roaster" (for what it does when it's on your lap).

Things like JetBrains CLion don't even make the M1 warm. That will make any Intel laptop start cookin'.


Sitting in front of a laptop, the fan coming on at max speed makes me feel like I am stressing the machine and need to be gentle with it. When using the same machine through VNC I feel completely insulated from that effect. It’s a more pleasant feeling. I’ve considered getting a desktop and just using my Macs as remote terminals.


I'm eager to see what can be done with the desktop tower workstation post-M1. I don't think anyone actually wants to have a slim laptop and use "the cloud" to "download more memory", but here we are in 2021 where people are saying 16GB max RAM is good.


>but here we are in 2021 where people are saying 16GB max RAM is good.

No amount of memory will ever be enough. The problem is not memory but inefficient programming. Desktop apps consume as much memory as possible. The only way to have a good experience is to have more memory than the average user, but then the average user catches up and you need even more.

Right now I have MS Teams running in the background purely to receive notifications, and it is using over 1GB of memory and 8% of my CPU just to show a notification window when I get a message. MS Teams on my iPhone, with 3GB of RAM, does exactly the same job and uses essentially no memory and no power, because it doesn't even have to be running to show me notifications.

I have 32GB on my laptop and I am only just above average now, but in a few years Electron will have ballooned so far that 64GB is the new minimum. Meanwhile, my 3GB iPhone and 8GB iPad feel more than enough.


Tell that to my friends whose main software is Chrome, Office, and VLC.


There is actually only one reason to do software development on a laptop: not having a good enough place to work. A laptop is essentially a set of compromises: between computing power and battery life, between screen size and weight, between the feel of the keys and thinness, between using a mouse and forgetting to bring one along.


The M1 Mac is also my first Apple product. And now I've bought a Magic Mouse 2 and, fuck Apple, it is laggy.


The default mouse settings are always too slow for my liking; I always speed up the tracking speed. I haven't used a Magic Mouse on a regular basis for a while (had both the v1 and v2), but once I updated the setting, it was as snappy as I wanted.


I already set the tracking speed to the highest, but this is really one of the worst mice I have ever used. The only reason I am still using it is that my other Bluetooth mouse is seriously laggy when I use it with the M1 Mac.


This is funny. For the last ten years I've used Mac laptops as my primary machine. Of course the fan noise was a great auditory cue telling me to check Activity Monitor/top for rogue processes, and helping me figure out which tasks took too many resources.

I didn't realize how much I relied on that cue until I temporarily switched to a Mac desktop machine during the pandemic when I didn't need to travel. The desktop fans pretty much stay at 1200rpm all the time. I think I've seen them spike once. I'd forgotten how much better desktops can be for some things.

Now I'm looking for a better remote access solution so I can consider sticking with a desktop at home and then maybe just an iPad for travel with good remote access and file sync.


I remember using hard disk drive sounds for disk usage, and/or capacitor noises, to debug issues. But I really like the M1 MacBook Air because it's so quiet: it helps me concentrate much better, and if you are talking to others or recording a talk it is silent. The other thing is that it has no moving parts, so it should be very durable.


I used to use capacitor noises too when I was younger. I could tell other devs when builds were finished by hearing the high-pitched noise stop. At the time I thought I could hear the CPU, but recently learned it was the capacitors lol


Inductor coils? I haven't heard of capacitors creating noise before.


I use one too and there are still some moving parts. The keyboard keys, the speakers, the display hinge and the Taptic Engine. Those are the parts I'm looking at intently as the ones that will fail on me, especially the hinge.


The most likely thing to break on a computer is the fan itself. Apple has revised the display hinge, and the keys have been redesigned. I don't know of any reports of the speakers or the Taptic Engine breaking (other than for people using Boot Camp on Intel MacBook Pros, which isn't an option anymore), but since there is no long-term data on any of those components, the only things I would question are the redesigned hinge and the keyboard.


Anecdata: I've never had a desktop or laptop fan fail on me, and I'm the kind of person who keeps them for a decade.

Spinning HDDs (Toshiba bearings), keyboards and hinges, on the other hand, have all failed on me multiple times in Macs, but the fans kept going. The keyboards were the last thing to fail in the 2009 design; they last about 10 years.

Also, while we're at it, mechanical failure isn't always the biggest concern these days. Apple has had Nvidia GPU issues (solder and fab issues) in the past, where they ended up underclocking the GPUs in a firmware patch in order to push failures out of warranty.


I bought a cheap M1 Air model (only the RAM is upgraded, to 16GB) at $1030 Apple-refurbished. My intent is to try to sell it a bit before the one-year mark.

For some of the reasons you mentioned (especially the last paragraph), but also the battery and the general non-repairability. It almost feels like a consumable item, compared to my previous ThinkPad.

Here's the thing: all the other laptops I was looking at cost at least twice the price. High-end Thinkpads, my second choice, almost 3 times.

All of a sudden, Apple became the budget choice for my use case. Hopefully in one year, when I sell this one, there will be competitors to the M1 chip (or maybe an M1X?), and I'll have more options.


> Here's the thing: all the other laptops I was looking at cost at least twice the price. High-end Thinkpads, my second choice, almost 3 times.

It shouldn't be this way, but most of the big enterprise vendors have ridiculous sticker prices. Unlike with Apple, volume buyers get steep discounts, and individuals are expected to wait for a sale.

Ask around online and you'll find it's fairly simple to get fancy Thinkpads for half-off.


It's highly dependent on the country you're in. I see people on r/thinkpad get new high-end ones discounted, the kind of discounts you just can't get in France.

What you can find in France is heavily discounted ones from people who resell their company-provided machines. I bought three T460s like that in the past, for me and my family. But no high-end models, nor much choice in customization.


It might be worth buying the discounted ThinkPad in the US (I assume that's where most of the heavily discounted ones show up) and then shipping it to France?


I looked into that too ahah. There are "forwarding" services that do that for you, if you don't have a friend there.

But I don't like ThinkPads enough to go through that rodeo. For now, Apple won my money.


The M1 is a complete SoC, and discrete graphics from Nvidia were so bad that Apple switched to AMD permanently.


That was just one example, one that affected me and millions of others. There are also plenty of other chip-level failures in newer Apple hardware, as quite well documented by Louis Rossmann. They're not all GPU/CPU issues either; there are many other chips that go wrong due to cheap component choices.

The GPU issues were both Nvidia's _and_ Apple's fault, due to assembly with low-quality unleaded solder. In either case, their attitude to customers was unforgivable: they replaced old broken boards with new broken boards until people just gave up or were pushed out of warranty... and the masses who couldn't be bothered to go through the pain of browbeating an Apple Store employee into submission got firmware patches that pushed them out of warranty.

Component failure, solid state or otherwise, is not unique to Apple and is an inevitability; it will happen again. Complete disrespect for their users by continually lying to their faces, on the other hand... that is something Apple is uniquely skilled at.


Fan life really depends on the use case. The bearings are roughly rated for a certain number of cycles, and most consumers just don't push them that hard.

For example, a tiny 10,000 RPM server fan running 24/7 rotates about 5 billion times per year (10,000 × 60 × 24 × 365 ≈ 5.3 billion). My desktop spends most of its time off, but if I game on it 10% of the time, its big fans might be at 1000 RPM; that's only around 50 million cycles per year.


I'm also trying to figure out a solution for remote access/file sync. I've been using Resilio Sync to sync files between my laptop and desktop. It works great, but it destroys the battery life on my laptop, so I've been looking for alternatives.

Syncthing is next on my to-try list, but it's very similar to Resilio Sync, so I don't have high hopes. iCloud Drive/Dropbox/etc. have issues with syncing git repos, so I'm not sure about those either.


MEGA has been working pretty nicely for me, including with Git. I use it on a macOS desktop and a Linux laptop. The only thing is that the mobile app isn't the most polished compared to, say, Apple's Files, but I don't need it often.


Syncthing is great; however, the biggest issue I've found with the Mac version is that you explicitly need to quit it from the menu bar or it will just drain your battery. Also, if you sync a Mac with anything besides another Mac, be prepared to see a .nomedia file (I can't recall exactly what it is) in EVERY. SINGLE. DIRECTORY. It's very annoying, to say the least.

That being said, Syncthing works great as a wholly cross-platform sync tool. I have used it with a PC, MacBook, and Android setup. Never have I not been able to get it to work.


To my knowledge, .nomedia files are created on Android to prevent the folder from being scanned for media. I don't think it's due to "anything besides", but Android in particular. I haven't tried configuring the Android Syncthing app to exclude .nomedia files, but it may be possible.
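If someone wants to try, Syncthing's per-folder ignore patterns should at least keep these files from propagating; roughly:

    # .stignore in the shared folder root (or via the folder's Ignore Patterns tab)
    .nomedia
    (?d).DS_Store

The (?d) prefix just tells Syncthing the file may be deleted if it's in the way of removing a directory.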


Then it's not .nomedia but some Mac-specific file. I can't recall what it is since I stopped using Syncthing on my Mac, haha. I rarely use it unless I kinda have to.


IIRC, macOS is well known for leaving .DS_Store droppings absolutely everywhere. If you ever shared a pen drive with a macOS user, it would come back with .DS_Store junk all over it. That might be the one you're recalling.

defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool true

defaults write com.apple.desktopservices DSDontWriteUSBStores -bool true

Run these both in Terminal, and either restart Finder or just reboot. Should solve that problem.

The whole point of those files was to store Mac-related metadata (such as icon positions and other stuff) that the filesystem in question did not have the capability to store, to preserve Mac users' expectations.
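Note those two defaults only stop new files from being written to network and USB volumes. For a drive that's already littered, something along these lines cleans it up (substitute the real mount point; /Volumes/MyUSB is a placeholder):

    # delete existing .DS_Store files from a mounted drive
    find /Volumes/MyUSB -name '.DS_Store' -type f -delete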


I have been using Syncthing across Mac, ARM Linux, Linux, Windows and Android. Occasionally an Android release turns into a battery drain, but I haven't had an issue on the Mac (my Mac shares are about 70GB). It could need a DB wipe and reinitializing? I think their file watcher is native.


> Now I'm looking for a better remote access solution so I can consider sticking with a desktop at home and then maybe just an iPad for travel with good remote access and file sync.

This is my solution. I don't really need to do any file syncing, as rsync or scp works fine for my purposes. It has been a fantastic dev experience: I remote in, bring up my last tmux session, and pick up exactly where I left off, regardless of the front-end machine. I use an iPad Pro for a ton of development now and it really is great for it.

I've got an overkill desktop dev machine (a previous gaming rig) that can make all the noise it wants in my basement, while my front-end shell remains quiet.

I set up ZeroTier so I have constant access to it whether I'm on my local network or remote.
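The front-end side of it boils down to a one-liner once ZeroTier (or Tailscale, WireGuard, whatever) gives the desktop a stable address; 'devbox' here is just a made-up hostname:

    # attach to the persistent session, or create it if it doesn't exist yet
    ssh -t devbox 'tmux new-session -A -s main'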


> I've got an overkill desktop dev machine (a previous gaming rig) that can make all the noise it wants in my basement, while my front-end shell remains quiet.

I want the best of both worlds and a nice uncluttered desk, so I ended up buying an Intel NUC 9 module plus an RTX 3060 that I keep hidden in a closet and game on remotely using Moonlight and GameStream. Everything works just fine on a gigabit connection and plays well with an Xbox controller on the M1, as if it were local. I can play everything at 1440p@60fps, which is more than enough for me.


Can you tell me more about your setup? What apps do you use on your iPad for remote access? Is it just remote shell or do you do Remote Desktop too? What do you use for Remote Desktop?


Not the original poster, but I use my iPad Air for a similar setup. My development environment runs under Docker Compose/WSL2 on my desktop, which I can start/stop using Chrome Remote Desktop. I've got code-server and Portainer in that compose file, accessible from my iPad via Tailscale; it makes for a passable development environment if I want/need to work away from the desktop for whatever reason.
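Not my exact file, but a compose sketch along those lines, in case anyone wants to copy the idea (the images are the public ones; ports, paths and the password are placeholders):

    # docker-compose.yml -- code-server + Portainer, reached over Tailscale
    services:
      code-server:
        image: codercom/code-server:latest
        ports:
          - "8080:8080"                 # VS Code in the browser
        volumes:
          - ./project:/home/coder/project
        environment:
          - PASSWORD=change-me          # placeholder
      portainer:
        image: portainer/portainer-ce:latest
        ports:
          - "9443:9443"                 # Portainer web UI
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock
          - portainer_data:/data
    volumes:
      portainer_data: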


Can you not use rsync on an iPad?


Is it better to have rogue processes silently wasting energy and creating heat?


I think this is a case where the meaning of the parent can be taken multiple ways, and one of the HN rules is to take the more generous case. My interpretation is that they are saying they see the purpose of needing some type of cue when CPU usage is high, but didn’t realize that they were using the fan speed as a proxy for CPU utilization until this was posted, at which time, they put 2 and 2 together. That’s not saying that the product is bad or not, just an anecdote.


That's pretty hilarious. Until recently I didn't think of the CPU fan as an auditory cue for how hard your system is working. But now I can remember times that excessive fan noise has prompted me to investigate the cause of excessive usage.


I had a habit of touching the strip of metal above the keys on my old MacBook Air, since that heats up before the fan becomes audible.

The sensible approach, of course, is to add a load monitor to the menu bar; now with my new M1 MacBook, I can simply make the appropriate noises myself, as necessary. This is the UNIX way.


"MenuBar Stats" has a versatile collection of widgets to add to the menu bar. I'd prefer something open source though, if anyone has recommendations, please share.

Once you're running a tool like that, it is interesting to see the efficiency cores often saturated and the performance cores usually sleeping.


I've been using this one: https://github.com/yujitach/MenuMeters


In addition to that I also use

https://github.com/exelban/stats


You can do this with xbar [0] (which is also just a super cool app). Looks like someone even made one specifically for checking the CPU throttling speed [1]

[0] - https://xbarapp.com/docs/plugins/System.html

[1] - https://xbarapp.com/docs/plugins/System/cpu-thermal-throttle...
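The plugin format is pleasantly simple if anyone wants to roll their own: an executable script whose stdout becomes the menu bar text, with the refresh interval encoded in the filename. A minimal load-average sketch (paths and folder per the xbar docs):

    #!/bin/bash
    # save as e.g. cpu.10s.sh in the xbar plugin folder; "10s" is the refresh interval
    load=$(sysctl -n vm.loadavg | awk '{print $2}')
    echo "load ${load}"   # first line shows in the menu bar
    echo "---"            # everything after --- goes in the dropdown
    top -l 1 -o cpu -n 5 -stats pid,cpu,command | tail -n 5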


[1] appears to use 'pmset -g therm' behind the scenes, and it doesn't seem to work on my M1 Air.


What I really want is a menu bar widget that detects apps using 100% CPU for more than a minute and offers to `kill -9` them. Has anyone made that?
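Not that I know of, but the detection half is nearly a one-liner if someone wants to build it (the threshold is arbitrary, and a real widget would need to sample repeatedly to confirm the usage is sustained for a minute):

    # processes currently above ~90% of one core; the header row filters itself out
    ps -Ao pid,pcpu,comm | awk '$2 > 90 {print}'

Feeding the resulting PIDs to kill -9 is left as an exercise, at your own risk.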


Hearing the disk going used to be an auditory cue too, which is now gone thanks to solid state drives.


Yes, I miss that calming crackle telling me that my PC hadn't locked up and was just chugging along. And the floppy boot sound: objectively worse, but still nostalgic.


Like the demonic scream of a dot matrix printer


Occasionally with catastrophic results: https://www.extremetech.com/computing/239268-spotify-may-kil...

That said, I don’t miss the noise or slowness.


Oh, yes. This has been an ongoing problem with "sleep mode", which users think means "off", but isn't really. Some Windows docs for hardware makers indicated that "sleep mode" should stop the fans, to maintain the illusion that the machine is "off". Some machines do that, and some stick to temperature-based fan control, so the fans continue to turn if the machine is warm. Some users then complain that sleep mode doesn't work because the fans are still turning.


Wait, why would the device get warm during suspend? Or do you mean a different kind of sleep mode? I'm thinking of the state where it just refreshes DRAM to keep its contents (this does not produce much heat at all) and everything else is off: that definitely does not require the cpu fan to run because the cpu is not processing any instructions during that state.



I assume it's not that it gets warm, but that it's still warm from running beforehand.


Hmm, if that were necessary, then all power states should keep the fan on for a while after running; a device that was shut down would also keep its fans on, and there would be no need to put in a requirements document that the fans must turn off to give the illusion of having shut down. To my understanding, though, if no new heat is being created (usually mainly by processors like the CPU or GPU), keeping the fans on is not necessary in regular laptops or desktops.


And this is how I ended up buying a second Nintendo Wii. They turned off the fans and fried themselves.


I bought a Lenovo P1 Gen 3 a couple of months ago and the fans spin while the machine is sleeping.

I'd love to know if it's crappy thermal design from Lenovo, crappy CPUs from Intel, or Windows sleep mode being too demanding. Maybe it's all three?


It might also be naive firmware that expects the same thermal/cooling constraints when everything else is otherwise going to cool passively. Or even a design choice to actively cool so a quick wake will be in a better thermal state.


Depends on which sleep stage it is (S1, S2, S3, etc.). And laptops differ by manufacturer, type, and model.


Lenovo supports S0, S4, or S5 in their newest machines. I changed my settings to use hibernate (S4) rather than S0 (modern standby).

That would have been useful for me today. Systemd again went into one of those infinite loops, consuming 100% CPU in one of its daemons that try to replace what already worked before (systemd-resolved), while I was sitting in the train without AC.

I only realized something was wrong when the Pinebook got way too hot.
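For what it's worth, when that happens, spotting and bouncing the offender is at least quick (assuming it really is resolved and not some other unit):

    # see which service is burning CPU, then restart it
    systemd-cgtop
    systemctl status systemd-resolved
    sudo systemctl restart systemd-resolved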


Funny. I spent a summer in 2009 in Vietnam, working on one of those old unbeatable Powerbook silver models, in an apartment without air conditioning. The fan would run at full power constantly, and finally gave up the ghost. There was no way to repair it where I was. So what I used to do was keep a bunch of Wired Magazines in the freezer, and put them under the laptop, changing them out every 20 minutes or so. The machine still works... but yeah, that fan noise is really a great subconscious reminder that you're using it too hard. Also a good way to know if your code just entered an infinite loop, long before any other warning comes up... and occasionally that clue gives you enough time to cancel a process before the input freezes.

[edit] Just had a cool idea, since this discussion is heading towards how subtly useful auditory cues have been in our workflows: what if different processes played fan sounds at different pitches, so the total volume still added up? That could actually be really useful out-of-band state information.


With the introduction of the M1, it's hard to find any use for paper magazines these days.


If I wait long enough, maybe I can sell my collection of 90s Wired for more than that stupid Mario 64 cart ;D


About the "sonification" of computer processes there is this experiment, recently shared on HN:

What Does the Event Loop Sound Like? https://medium.com/att-israel/what-does-the-event-loop-sound...


SE Asian humidity was brutal on those older Intel-based MBPs. Even with A/C, it's still so damned hot and humid that you don't have to push the rig too hard before the fans start spinning. And heat is the enemy of Li-ion batteries.


The humidity in the region was also a problem for earlier iPhones: the indicator stickers used to check whether the phone had been dropped in water could change colour just from the ambient humidity. https://www.cultofmac.com/13571/the-tropics-may-be-too-humid...


I did this in the Sonoran desert with my 15" MBP when the temperature was pushing 50C. Had a stack of those giant ice packs from some meal delivery service. Worked with the laptop sitting on the ice pack and a towel until it was completely thawed after 4-5 hours.

