
One of the reasons iOS feels smoother than Android is that the render loop of the OS is decoupled from the app. The apps aren't allowed to introduce jank, so if you're scrolling a webpage while stuff is loading simultaneously, iOS will be way smoother. I think this is also why they can have such low latency on inputs, for example with the Apple Pencil, which has much lower latency than the Surface Pen or Android styluses. I had a 120 Hz Android phone for over a year, and while the frame rate when scrolling is slightly worse on iOS, overall the OS feels more fluid to me. On a 120 Hz iPad there's no comparison.

I am speculating here as I don't know for sure, but I remember iOS started as a derivative of OS X, so this may be the case for macOS as well. So I think it's not your imagination; it's a different input and render architecture than Windows or Android.




Android has had a separate "render thread" for ages, though it's per app and runs in the app process. Some animations run on it, but many do not. You can't access it directly from within the app.

The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.


> The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.

This feature has been in the Linux kernel for ages[1]. Android and ChromeOS are based on the Linux kernel, and have had this feature for quite some time. This is nothing new.

[1] https://community.arm.com/developer/ip-products/processors/b...


So why does Linux UX feel slower?


Because in many cases, it is. Unfortunately, there isn't as much communication between the GUI people and the kernel people in the Linux community as there is between those same groups at Apple Inc. Not to mention, there are multiple competing groups of GUI people in the Linux community making coordination across these levels difficult. Also, there are many competing interests working on the kernel who might oppose kernel-level optimizations which favor desktop usage at the cost of, for example, server usage. As a result of these and many other factors, Linux's desktop UX remains far less optimized when compared to macOS's desktop UX.

As with most of Linux's rough edges, however, this is trivially fixable if you're technical enough. Of course, that's exactly macOS's advantage. If you want the best experience on macOS, you don't need to be technical.

Personally, I run Linux with a tiling window manager and lots of custom keybindings and little scripts. Having used the default macOS experience a few times at work (on machines far more powerful than my dinky laptop), I can assure you that my highly customized setup feels far more responsive. On the flip side, it required a lot of up-front investment to get it to this point.


I never felt the UI responsiveness difference between Linux and macOS, but Windows (including freshly installed on powerful many-core machines) is a different story. The number one reason I ever switched from Windows to Linux is the latter always feeling way more swift: UI responsiveness always remaining perfect and some background tasks also working much faster. And I never actually used lightweight WMs, only KDE, GNOME and XFCE. The first time I noticed some slowness in Linux was on a Raspberry Pi (4, with the default LXDE).


This.

I think the main advantage with macOS is that it's above all else a system designed to be used interactively, as opposed to a server, so they don't have to put out a configuration "good enough for most things".

I also run Linux with a tiling window manager on an old machine (3rd gen i7), and it flies. One thing that made a huge difference in perceived latency for me was switching the generic kernel with one having Con Kolivas' patch set [0].

I'm using Arch and sign my own EFI binaries, so I don't care about out of tree patches, but for Ubuntu users and similar who don't want to mess with this, there's an official `linux-lowlatency` package which helps [1].

---

[0] https://wiki.archlinux.org/title/Linux-ck

[1] https://packages.ubuntu.com/search?keywords=linux-lowlatency...


X-Windows: …A mistake carried out to perfection.
X-Windows: …Dissatisfaction guaranteed.
X-Windows: …Don’t get frustrated without it.
X-Windows: …Even your dog won’t like it.
X-Windows: …Flaky and built to stay that way.
X-Windows: …Complex non-solutions to simple non-problems.
X-Windows: …Flawed beyond belief.
X-Windows: …Form follows malfunction.
X-Windows: …Garbage at your fingertips.
X-Windows: …Ignorance is our most important resource.
X-Windows: …It could be worse, but it’ll take time.
X-Windows: …It could happen to you.
X-Windows: …Japan’s secret weapon.
X-Windows: …Let it get in your way.
X-Windows: …Live the nightmare.
X-Windows: …More than enough rope.
X-Windows: …Never had it, never will.
X-Windows: …No hardware is safe.
X-Windows: …Power tools for power fools.
X-Windows: …Putting new limits on productivity.
X-Windows: …Simplicity made complex.
X-Windows: …The cutting edge of obsolescence.
X-Windows: …The art of incompetence.
X-Windows: …The defacto substandard.
X-Windows: …The first fully modular software disaster.
X-Windows: …The joke that kills.
X-Windows: …The problem for your problem.
X-Windows: …There’s got to be a better way.
X-Windows: …Warn your friends about it.
X-Windows: …You’d better sit down.
X-Windows: …You’ll envy the dead.

https://donhopkins.medium.com/the-x-windows-disaster-128d398...


> like Sun’s Open Look clock tool, which gobbles up 1.4 megabytes of real memory!

It's funny to read this in an era when smartphones come with 6 GB of RAM to compensate for developers' laziness and unprofessionalism.


Almost all major distributions use Wayland as of today.


Move on, it's 2021...


I use Plasma Desktop and it's been more responsive than macOS was, so I don't know.


On the exact same hardware, Linux always felt much faster to me. I remember doing stuff like resizing Finder windows vs KDE's Dolphin: Finder would be all janky and laggy, and KDE wouldn't miss a frame.



Yeah, my go-to for speeding up Macs is to throw Linux on them.


>the render loop of the OS is decoupled from the app

Can you elaborate on this? If you do too much work on the main thread in iOS, it's going to hang the UI. Isn't the main thread the "render thread"? Do scroll views have some kind of special escape hatch to get off the main thread for continuing to scroll if the main thread is blocked with loading the content?


I believe the point is that the "main thread" for your iOS application, is not the main thread of the OS. They're totally decoupled.


Err, same with Android? And every OS ever. That's just standard process isolation. Or am I misunderstanding something?


There are some deep-dive articles on the way the input loop works. But OP is correct, and that's the reason iOS feels smoother. Android has a lot more UI lag.


I don't think Apple has any special tricks for input loop.

Some Android phones really have input lag, but it is not caused by CPU load. For example, on my phone, there is approximately 100-150 ms lag between tapping the screen and registering the touch. The lag is not caused by CPU load, but by slow touch sensor.

I don't think Apple has any smart code-optimization tricks. Either they have a faster touch sensor, or some people just believe that if something is made by Apple then it is of better quality.

Here is a comparison of input lag in games in iOS and Android [1] and it shows similar results: 80 ms between a tap and reaction on screen.

[1] https://blog.gamebench.net/touch-latency-benchmarks-iphone-x...


They do! See https://devstreaming-cdn.apple.com/videos/wwdc/2015/233l9q8h... and the corresponding WWDC session on them minimizing the input-to-display time.


This is off-topic, but I love that the title of the slides is "233_Advanced Touch Input on_iOS_04_Final_Final_VERY_Final_D_DF.key".


I wonder why somebody working at Apple wouldn't just use git for that?


<laughs in Windows 3.1>


UIView animations do not run on the main thread, and will not be blocked if the main thread is blocked. This does help a bit with keeping the OS feeling smooth, but it is far from the only reason.


This is not entirely accurate. iOS indeed uses a client-server UI model, kind of similar to X11. Along with submitting “widget” hierarchy updates, it also supports submitting “animations”. The downside is that the animation states are not truly accessible by the actual app after it submits them.

The scrolling animation is 99.9% of the time implemented as a client-side animation timer submitting non-animated hierarchy updates to the server. It’s common to have janky scrolling.


> Along with submitting “widget” hierarchy updates, it also supports submitting “animations”.

Is that how all these third party iOS apps all have completely consistent “pop” animations on long press?


No, that would be OS-provided GUI components (or sometimes manual reimplementation), similar to how most win32 apps had the same right-click menu behavior.


Off-topic: I see the lack of standardized OS-provided GUI components in Unix/Linux distros as the main root cause of low Linux adoption. I'm assuming there isn't such a thing since I haven't ever been able to notice any consistency in the GUI of any Linux distro and/or any GUI app that runs on Linux :P But I may be totally wrong.

On-topic: they should build an M1 app that simulates loud coil whine at low-CPU usage, so you can feel like you're using a Dell XPS.


Well, there are common components, it's just that there are multiple standards (Qt, GTK, wx, etc)...

I'm using a tiling window manager with mostly GTK apps, so pretty much all menus and such look the same. The worst offenders are Firefox and IntelliJ, although they have improved a bit lately.

However, I'm not sure that this is the reason for lack of adoption. Windows has always been a patchwork of interface design, every other app having their own special window decorations and behavior, including MS' own apps (thinking of Office in particular). Also, seemingly similar elements, such as text inputs, behave differently in different places. For example ctrl-backspace sometimes deletes the previous word, sometimes it introduces random characters.


The unique thing about this is that it takes a widget from the app, and expands it out while blurring the rest of the screen. So it’s not just an OS gui component.


Blurring happens inside the UI server process. Here is a related technique in macOS: https://avaidyam.github.io/2018/02/17/CAPluginLayer_CABackdr...

Basically, it's like an iframe — you declare what contents this portion of screen should have, and the render server composes it onto the screen. The render server is shared between all apps (just like in X11), so it can compose different apps with dependent effects (like blur). Apps don't have direct access to the render server's context, just like a webpage doesn't have any way to extract data from a foreign iframe.


I wonder what made Apple give the iPad Pro such an awful screen though, considering all the software optimisations they do. I got the M1 iPad a month ago, and the screen has absolutely horrendous ghosting issues. Like, what is this? An LCD from 2001? Just open the Settings app and quickly scroll the options up and down: white text on the black background leaves such a visible smudge. It bothers me massively when scrolling through websites or apps in dark mode; it honestly doesn't feel like an Apple product to me. I haven't seen this issue on any screen in the past 10 years, and here this brand new (and very expensive) iPad with a 120 Hz screen has something that looks like 50-100 ms gray-to-gray response time.


This is a very interesting anecdote considering the M1 iPad Pro is supposed to have one of the best screens available on any device of that form factor, the same XDR display technology as their $5,000+ monsters. Every reviewer has mentioned the display as the selling point. I have looked at them in person and have been debating buying one, but next time I'm at an Apple Store I'll want to see if I can replicate what you're seeing.

You might be experiencing the mini-LED effect where the back lighting is regionalized, which isn’t ghosting but can be noticeable.


It's on the 11" model, so definitely not a mini-LED issue.


Edit: the parent commenter does not have a miniLED iPad.

If you have the 12.9” M1 iPad, the ghosting is likely due to the new miniLED backlight which was introduced with that model.

This backlight is not static, but tracks to content in order to increase effective contrast (similar to FALD backlights on higher end LCD TVs). If the backlight does not track the LCD content fast enough, there can be ghosting.

In addition, since the LEDs are large compared to pixels, you can sometimes see haloing, particularly in small bits of white on a black background.

Overall, while the display is great for video, especially HDR video, it has some problems with UI and (white on black) text.


It's on the 11" model, so regular old LCD model, not the new fancy microled.


No issues on my iPad; you might want to take it to Apple.


I’m guessing it must be an issue with those new miniLED screens, or some other significant generational problem. I have two older 12.9” iPad Pros, one 1st generation and one 2nd generation (the first with the 120hz display). They are both excellent displays with no such issues with motion or ghosting.


I’m worried the upcoming upgraded MBP will only have this option. Although I read they released an update that should minimize this issue. Have you tried that?


To this note, I've noticed huge discrepancies in display quality in iPads. They were passed out in my last year of high school, and each one even appeared to have slightly different color grading. The whole thing was pretty funny to me, especially since I have no idea how artists would be able to trust a display like that.


The display on the mini-LED iPad Pro is Apple's first more or less in-house display. The panel itself is LG's, I believe.


Recently I visited the Apple store and compared the latest 11 inch iPad Air (A14 processor) and iPad Pro (M1 processor) models side-by-side. I loaded up Apple’s website in Safari and scrolled. The Pro is noticeably butter-smooth, while the Air stutters. My iPhone also stutters in the same way, but it’s never bothered me before. It’s only after looking at the performance of the M1-driven iPad Pro that I knew to look for it. And I previously had a similar experience noticing how much smoother my iPhone was than my old Android phones!

I don’t know for sure the processor is the difference, this is just a report of my observations.


That's not because of the processor. It's because the newest iPad Pro has a 120 Hz display and the other devices you were comparing have 60 Hz displays.


While 60 Hz screen refreshes are less smooth than 120 Hz, the small (but noticeable) difference due to refresh rates wouldn't correctly be described as "stutter".

The M1 processor makes a real difference. I have both the M1 and the most recent non-M1 iPad Pros.


Yeah you really don’t notice 60Hz scrolling until you try 120Hz scrolling. It’s a bit like you didn’t notice Retina displays until you tried one then looked at a pre-retina display. It’s crazy how you adapt to perceive things once you see something better/different.


Maybe I should hold off on the iPad Pro until I can get 120Hz on all my devices.


This is not really true, at least in any sense that really differs from other OSes. And, if you watch closely, you will notice that poorly-behaving apps running in the background can and will introduce jank in the foreground process. Since iOS lacks good performance monitoring, this (along with battery consumption) has historically been the easiest way to figure out if an app is spinning when it shouldn't be.


I'm sorry, I have a Samsung Galaxy S20 Ultra with a 120 Hz display. Using my girlfriend's iPhone 12 makes me dizzy.

120 Hz is something you don't notice much when you enable it, but you definitely notice when it's gone.



