Linux Touchpad Like a MacBook update: progress on multitouch (harding.blog)
270 points by wbharding on June 23, 2020 | 127 comments



Damn. This is the first open source project I've ever sponsored and been really excited about sponsoring.

However, I've been using Ubuntu on Wayland for the past 2 years and have thoroughly enjoyed it. Frankly, I'm not willing to move back to X and it seems like a step backwards to me to invest any time, effort, or money in X at this point, so I'm going to discontinue my contribution.

I was hoping my monetary contribution would be a small ($25/Month) way to push Wayland forward, but the approach Bill and Povilas are taking is doing exactly the opposite by making it more comfortable for folks to stick with X. I understand there is disagreement and apprehension among the community with the slow move to Wayland, perhaps not at the same scale of systemd, but I firmly believe it's the best path forward and that X should be left behind.


Povilas from the linked article here.

Having support in X (even if only at the proposal stage) will allow much easier work in the toolkit and application layers. X and Wayland together cover essentially all Linux users, so we will not need to estimate how many users would benefit from implementing gesture support in toolkit X or application Y. It will make convincing project maintainers easier.

This is really important, because maintainers of open source projects don't usually care about suggested features if they're not interested in them themselves. A new feature means additional work for them - discussing the design, reviewing PRs and handling eventual bugs. Often the person who implements a feature disappears after the PR is merged. This makes maintainers view all feature proposals with a grain of salt. We need a convincing story of how the majority of end users would benefit in order to make the contributions easier.

Technical skill is not enough when contributing in open source. Politics are often as important.


Hi Povilas, thanks for the reply.

To be clear, you're saying that implementation in the applications/toolkits ecosystem is the barrier to a better trackpad experience on Linux and that by unifying the feature set of both X and Wayland, this will encourage that implementation among the developers of the applications/toolkits?

If that's correct, that is one angle I had not considered before. It's also somewhat unfortunate that it was not spelled out more clearly in Bill's post, but hindsight is 20-20 of course.


> To be clear, you're saying that implementation in the applications/toolkits ecosystem is the barrier to a better trackpad experience on Linux and that by unifying the feature set of both X and Wayland, this will encourage that implementation among the developers of the applications/toolkits?

Yes, exactly. AFAIK only Gtk right now supports any kind of touchpad gestures on Linux. Also, most users are still on Xorg, and it's currently unknown when Wayland will become the default universally. So we would be in a weak position to demand maintainers' attention, as there's little immediate benefit.


That's unfortunate. AFAIK GTK4 will be Wayland-only and out in a few months, and some bleeding edge distros such as Fedora already run Wayland out of the box - except if you have NVIDIA, because Xwayland is still crap under nv. But not for long: apparently there are rumours and an intention to improve the nv/Xwayland situation, just nothing concrete out yet [1].

So yeah, betting on Xorg in 2020 is not a smart move IMO.

1: https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-G...


> AFAIK GTK4 will be Wayland-only and out in a few months

I think this is incorrect. See the following; the X11 backend is listed among the supported backends. https://developer.gnome.org/gtk4/stable/extra-configuration-...


Distros keep trying to move to Wayland and then keep coming back to X11 due to usability. GTK4 is not going to help that.


I share this sentiment and have also withdrawn my contribution. I remember reading that X is in maintenance mode, and putting countless hours into a plug-in for it because of some people’s refusal to switch to Wayland just seems futile and a waste of effort to me, especially since, as they have mentioned, there already is a ‘happy path’. With the next LTS release of Ubuntu, both Fedora and Ubuntu will be Wayland by default. That’s what, something like 80% of all desktop distros in use?

In the end it is not my money so I cannot decide, but IMO they should focus on bringing Qt and WxWidgets up to snuff with GTK gesture support. That means that as long as people are on Wayland, gestures will work ‘everywhere’.


Same here. I'm withdrawing my contribution if they're just going to pursue short-term solutions which will slow down the migration to Wayland. This is an irresponsible use of funds.


Qt even already has QNativeGestureEvent stuff, seems like it's only wired up on macOS for now.

Also, what really sucks about Qt is there's no kinetic scrolling in classic qt widget apps (e.g. telegram-desktop). Or there is but every app has to enable it or something??


I think the strategy seems reasonable. It seems much easier for toolkit and application developers to feel like supporting gestures is worthwhile if they know that the large fraction of their users on X will get this feature. I think otherwise it can be easy to only want to implement things for the intersection of X and Wayland.


Honestly the write-up seems OK, but it does not evoke optimism in me. I can't help it, but to me macOS and its multifinger gesture support is not about how many gestures or actions it supports, but the fluidity of the execution of gestures with animations.

When you start pinching/unpinching fingers apart, the zoom happens in sync with your finger spacing. As soon as you start pinching, the application seems to receive with the event stream all the x and y coordinates so that pan and zoom can be done simultaneously in lockstep with the finger positions and spacing on the touch pad.

Similarly when you pull 2 or 3 fingers from the side to e.g. go back a page, the reveal animation starts, or an arrow appears, and you can change your mind mid gesture by reversing the finger movement and the animation feedback is helpful enough to tell you that indeed the gesture will be canceled.

I honestly hope that the developers interested in improving this in libinput and the stack have the same understanding that gestures are not about recognizing the one-off swipe/pinch movement with a couple of fingers, but about low latency, fluid interaction of fingers with tightly coupled visual animation feedback. The other (one-off) type of gesture, where you swipe your three fingers to the left and then after a sampling interval the framework or stack receives a ThreeFingersLeft swipe event (with length or time or xy values), is IMO a very underwhelming experience.


I’m a long time MBP user who’s switched to Linux on a thinkpad recently, after the butterfly switch debacle.

I think the one gesture I used more than any other on my MBP was the 4-finger swipe left and right to switch workspaces. I would have a fullscreen iterm/tmux/vim session in each workspace - each tmux is a separate project with multiple panes and windows. I can't really bind a keyboard shortcut to switch workspaces because I have so many shortcuts already bound to tmux/vim/coc and other vim extensions, which is why it's so nice to be able to switch workspaces with a touchpad gesture. I could flip back and forth between workspaces effortlessly, without losing precious key shortcuts for development. It’s a fantastic dev setup.

Fedora 32 seems to support this now! I don’t think that F31 did, but on F32 I can do the 4 finger swipe to switch workspaces, and on my thinkpad it works pretty much as smoothly and as interactively as it did on my MBP - swipe slowly and the screen slides following your fingers, change direction in the middle of the gesture and the screen slides back - the whole ‘stream input events and the screen is an extension of your fingers’ as you describe.

The Linux desktop is getting a lot better. I don’t think I would ever have switched without Wayland and now they’re getting gestures right.


Three finger drag is the killer one for me. Once you're used to it, it's difficult to go back.


Three finger drag is also a killer feature for me, and unfortunately that very feature highlights a key difference in the way macOS and the Linux desktop are developed [1].

To get three-finger drag to work, you need a gesture, right? So you report the issue to libinput. Libinput looks at it and says "well, this is more than just a gesture", and says "this should be implemented in the compositor". Now, this is desktop Linux, so you need to get at least three compositor projects to agree 1) that yes, it's their problem and not a libinput problem, and 2) to actually fix it in their compositor. Sigh. And what's more - this isn't even the end of the problem! This is just for dragging windows! Now you also need to get this for arbitrary drag and drop actions like moving text or stuff in your file manager, so now you also need fixes in your UI toolkits. There are at least two of those you'll need as well. Pile on top of that the fact that most maintainers in this web have never extensively used a Mac, and don't even understand why this is something people want in the first place. This kind of stuff is very hard to fix on the free desktop.

If the same discussion happens for macOS, at worst some manager who is the boss of both of the squabbling teams will get mad, tell them that the UX here is more important than their technical squabbles, and order them to sit down and fix it. To be clear: I completely understand how these layers of abstraction work on the free desktop, and this is much harder to fix. But I sure wish there was a stronger incentive for these abstraction layers to fix these cross-cutting concerns.

I have some hopes pinned on recent progress for this [2].

[1]: https://bugs.freedesktop.org/show_bug.cgi?id=89999#c20

[2]: https://gitlab.freedesktop.org/libinput/libinput/-/issues/29...


I think a lot of this is especially bad right now because we're trying to unwind a lot of the bad system architecture that was imposed by X. A lot of maintainers are finding that they need to do things themselves that X used to do for them.

There are other issues on the horizon - if input preferences are no longer held in X (because that's not libinput's job and it shouldn't be) but are pushed up to gtk/glib/qt/kde, does that mean that whenever I use a kde app in gnome that it uses some random key-shortcut defaults rather than what's in my gsettings? That seems to be the case right now.

Also, I owned a multitude of MBPs through the years, from the first Intel Al-books around 2005 through to my last MBP, a 2015 model. I was heavily invested in the platform because it seemed like that manager you describe had the same sense of taste and the same engineering intuitions as I do. Well, that all ended with the butterfly-switch keyboard. It was a sad, angry, bitter divorce, but I left Apple and will probably never buy another MBP. The butterfly switches are gone, but I know now that that manager lacks the integrity and professionalism to manage the platform. It was more important to fluff Jony Ive's ego.

That opinionated manager can giveth, and she can taketh away.


> If the same discussion happens for macOS, at worst some manager who is the boss of both of the squabbling teams will get mad, tell them that the UX here is more important than their technical squabbles, and order them to sit down and fix it.

That strikes me as rather optimistic. I mean, yes, in theory a company does have enough overarching management that it can force coherent action, but... the "Microsoft" in https://laughingsquid.com/organizational-charts-for-tech-com... is a thing. Maybe Apple is still sufficiently centrally-driven that it works there? Historical accounts certainly make it seem like they've managed their share of infighting, but I suppose by the time they went to market it tended to be dealt with.


Apple has had consistent three finger drag since what, 2012? At least from this perspective they seem to be doing alright.


Agreed. It strikes me as odd that Apple hides this ability deep in the Accessibility settings.


It wasn't always relegated there, and had a more prominent place right in the trackpad settings.

Probably related: right around when the setting was moved deep into accessibility, Finder started having issues with three-finger drag.


Because it changes the gestures for workspace switching / expose. They are no longer three-finger, but four-finger, thus more difficult.

I'm in the group that gladly foregoes three finger drag, if it means we get to keep other three finger gestures.


When did switching workspaces become four-finger? Meaning you can't use the three-finger option? Catalina? I have a MacBook Pro on Mojave and the trackpad accepts three-finger for the gesture of switching workspaces and the Magic Mouse has an option for two-finger.


When you enable three finger drag, all the other gestures will switch to four finger.

Disabling the three finger drag is not enough to switch everything back; you have to configure that in a separate control panel.


Might just be that most general users don't have a need for that?


Same here. When I first saw that feature in 2014 on my MBP, I thought Linux would have it soon. Man, was I wrong... But actually it should work, as I can e.g. tap 3 fingers on a link here on an XPS to open it in a new tab.


I used it for a while and once again I keep everything on one screen and use command tab. It isn't difficult to go back.


It's not just moving windows around. It's all drag operations. Highlighting text in a webpage or console, dragging files between finder and other apps, dragging UI controls like sliders.


Isn't it 3 finger?

As someone who installed Linux on a computer for the first time 22 years ago (Slackware via floppies!), but have never used it full time for more than a few days before I'd jump back to Windows/Mac, I'm amazed more every year how "we're almost there" is a recurring theme.


It's a dropdown in Preferences. I have mine set to 3 fingers because my pinky is short compared to my other fingers. My wife has hers set to 4 fingers.

I forget which is the default.


I don't recall changing it when I got a new machine in December, but with so much stored in iCloud, I'm uncertain if that's default or not.


> I'm amazed more every year how "we're almost there" is a recurring theme.

Yeah, but I think a lot of that was because so much dev effort has been wasted trying to make X11 suck less, and building infrastructure around X11. Between Wayland, libinput, and freedesktop, it feels like we're finally moving forward.

I don't think that the linux desktop will ever be 'mainstream' - there are too many issues with commercial software - but at least it can be a pleasant experience for the technically-inclined.


The 4-finger swipe is responsible for a huge number of family tech support calls. "Dad, all my windows are gone again!". Pretty much a constant theme.


Imagine if there was an undo buffer for OS events and one could just say, "reset workspace" and boom, everything is fixed. PriorArt.


Why not turn it off for them then?


It turns out that kids like to go into settings and click every button and change every thing. I don't really mind. I pretend they are learning something about computers.


They are! They’re exploring and being curious. This is one of the most important things to encourage. I’m so thankful my parents let me disassemble the VCR, computers, other people’s computers, etc. I did end up learning a lot and the curiosity has served me well.


I do this so much I just assign Ctrl+1/2/3/4 to switch instantly to desktops so I'm not having to move my hands so much.


I implemented that gesture in Wayfire last year :) but yeah, GNOME already had it for quite some time. KDE Plasma too, I think.


As someone who used a MacBook for ~5 years, I miss the touchpad, especially for switching between workspaces.

The closest alternative (for me) on Ubuntu (Gnome) I've come up with is Super+J/K for switching between workspaces and, as a bonus, moving windows between workspaces with Super+Shift+J/K. Feels almost as productive as on a MacBook with the touchpad.


Have you tried this? https://github.com/bulletmark/libinput-gestures

I'm using a three finger swipe for switching between workspaces. But I agree with the comment above, the animation is key.


If you use Wayland on Ubuntu (Gnome), the exact gesture you're talking about already exists, it's just vertical instead of horizontal. It's actually my most used gesture personally.


It worked on Fedora 31, but they made a noticeable improvement in F32. If you are a fan of Wayland, I recommend checking out Sway.


As someone using sway on Arch, is there a way to get these gestures? Or is it something specific to Fedora?


Should be Ctrl + left/right arrow key, if I remember correctly.


Povilas from the linked article here. The design of how libinput handles gestures is exactly like you describe - the application will get a Start event when a gesture is identified, an Update event when finger positions change, and an End event when the gesture completes. This will make it possible to implement the macOS-style gestures in the end. There are more complex details of course, but the basic point stands.
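
For anyone curious what that Start/Update/End stream looks like from the consuming side, here is a minimal, hypothetical sketch of a client callback. The event types and accessors (libinput calls them BEGIN/UPDATE/END) are the existing libinput C API; the reporting logic is illustrative only, and context/event-loop setup is omitted for brevity.

    #include <libinput.h>
    #include <stdio.h>

    /* Sketch: one dispatch function a compositor might call for each
     * gesture event pulled off its libinput event loop. */
    static void handle_gesture(struct libinput_event *ev)
    {
        struct libinput_event_gesture *g = libinput_event_get_gesture_event(ev);

        switch (libinput_event_get_type(ev)) {
        case LIBINPUT_EVENT_GESTURE_SWIPE_BEGIN:
            printf("swipe begin, %d fingers\n",
                   libinput_event_gesture_get_finger_count(g));
            break;
        case LIBINPUT_EVENT_GESTURE_SWIPE_UPDATE:
            /* Continuous deltas are what let a compositor slide the
             * workspace in lockstep with the fingers. */
            printf("swipe update dx=%.2f dy=%.2f\n",
                   libinput_event_gesture_get_dx(g),
                   libinput_event_gesture_get_dy(g));
            break;
        case LIBINPUT_EVENT_GESTURE_SWIPE_END:
            printf("swipe end, cancelled=%d\n",
                   libinput_event_gesture_get_cancelled(g));
            break;
        case LIBINPUT_EVENT_GESTURE_PINCH_UPDATE:
            /* Scale and deltas arrive together, so pan and zoom can
             * track finger spacing simultaneously. */
            printf("pinch update scale=%.2f\n",
                   libinput_event_gesture_get_scale(g));
            break;
        default:
            break;
        }
    }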


Hey thanks for all of your work on libinput. A lot of us Mac refugees are really happy to see how much better things are with the linux userspace now. You're making a lot of people very happy.

Crystal-clear graphics with Wayland, great new hardware available from the two big linux-friendly laptop makers (Lenovo+Dell), great touchpad software support with libinput... it's a great time to be on desktop linux!


I think you should thank Peter Hutterer for his contributions to libinput. I'm just a random developer who loves complex problems and free software and was also interested in trying to improve Linux touchpad support :-)


Thanks for responding directly. I had the same question in mind as the post you replied to, so I'm happy to hear the right goal is being set.


Haha, very true. At one point there was this perfection in the Ubuntu desktop where I could do wonderful things like pick up a window with the pointer and do keyboard shortcuts to move workspaces and the window would travel with me and stuff like that. It felt so natural that I didn't realize I was doing it till one day I upgraded and it went away. I suspect it was some emergent property of the subsystems that wasn't specifically written for because I don't think it exists anymore.

OS X is like that. So many of the gestures are so natural that you just do them without thinking and I only notice when they're gone.


Another nice aspect of this gesture that is easy to overlook shows up in the scenario where you have two workspaces available but you are trying to, e.g., swipe left (to move right) when you are already on the rightmost workspace.

The contents of the screen will slide a few hundred pixels left, but since there is no workspace on the right to take their place, what slides in is just empty black space - which then immediately bounces back out and returns you to your screen.

The existence of this behavior very clearly communicates to the user that their gesture was received correctly but cannot be successful for some reason - far more eloquent than any dialog box with printed words or any sound effect.


Not just the fluidity of the gestures, but the overall resolution and responsiveness of the device. If I had to pick my top requirements for a trackpad, I'd pick:

1. Latency: If I drag my finger back and forth across the trackpad, I expect the mouse to match exactly where my finger is--not where it was 50+ms ago.

2. Resolution/Acceleration: I expect to be able to navigate my entire screen without lifting my finger and repositioning it. Further, I expect to be able to move the mouse one pixel by making a very small move on the trackpad (or by rolling my finger a little).

After those "table stakes" items are solved, then let's talk about adding gestures.


> As soon as you start pinching, the application seems to receive with the event stream all the x and y coordinates so that pan and zoom can be done simultaneously in lockstep with the finger positions and spacing on the touch pad.

I just tried this in Preview, Safari, and Chrome and none seem to actually do this.


I have experienced none of the fluidity you mentioned.

The UI on macOS to me feels frustratingly slow compared to XFCE. I've tried to disable animations, reduce motion and all the tweaks that can make it a little bit faster, but it remains being unacceptably sluggish.

And please abstain from downvoting unless you've at least tried XFCE and macOS enough in order to make an objective comparison.


The fluidity is about keeping activity on the screen tightly in sync with motion on the trackpad and having little or no unexpected behavior or jank when e.g. reversing a motion partway through to undo it (say, starting to slide three fingers up to expose all your windows, stopping to peek at something, then sliding them back down to put everything back where it was - you can even stop in the middle of it to make the windows do a little dance and nothing goes badly wrong or far out of sync). It nails that stuff even on my aging, low-specs-even-at-the-time 2013 MBP. It's not so much about raw UI speed, which, yes, light Linux/BSD window managers still hold the crown for (outside dead or niche operating systems like BeOS and QNX, or maybe even old versions of Windows, which actually hold the crown).


BeOS lives on in Haiku (https://www.haiku-os.org/) which is both still alive, and blows most "light Linux window managers" out of the water in UI speed. :)


I have a VM with Haiku and I can confirm it is indeed very fast.

I am no Haiku expert, but Haiku's window manager does not support window composition and does not use GPU acceleration, right? It is also not modular and cannot be replaced.

You can disable compositing on XFWM and make it run a little faster on lower-spec hardware. You can also use a WM like i3, Window Maker or Enlightenment.

IMO, XFWM is the right balance between functionality and performance.


The whole point of Haiku is a holistic approach to system design, so no, you cannot easily replace the window manager. You can, however, write "decorator" plugins which draw different window borders, write shortcut plugins for "tiling", etc.

It indeed does not use GPU acceleration; Haiku as a whole does not have 3D drivers either (but there are plans.)


It’s honestly pretty bad on their slower and/or older models - they’re pretty clearly developing for what their devs have, which is their higher-spec machines. If you go into an Apple Store some time and play with their higher-spec machines it’s pretty good.


True. But what happens when you run XFCE on one of those higher-spec machines? The advantage remains.


Perhaps this is a UI preference difference. I like the animated views on OS X and the speed seems acceptable but there are definitely browser apps (like the Chase website) where things animate in slowly and I get frustrated. So perhaps we're just differently placed on the curve.


Does it? There's a limit to how "snappy" XFCE gets.

I think this is the "animations vs no animations" argument, maybe - I like them as they give context to my actions, but I imagine some people don't.


For nice widget animations you can install a GTK engine/theme. There is a deluge of GTK themes you can choose from if that is your thing.

There are many themes inspired by macOS if you are aiming for that kind of look.

http://reddit.com/r/unixporn has a lot of that stuff. You can just go and grab someone else's configuration and apply it to your system.

If you are aiming for window effects with no regard for performance, try Compiz. You can add as many window effects as you want.


Sure, but when you add those animations to your setup, do you perceive it as being slower?


The touchpads on Macs are, in my opinion, one of the biggest advantages they have. It's not just multitouch, it's also the sensitivity, the clarity between clicks and scrolling, and the general responsiveness of it. Force clicks are one exception: when I force click it's almost always by accident and I have never found a situation where force click was tied to useful functionality.


I think force click to look up the dictionary is useful, but that's pretty much it.


My Catalina install broke the dictionary app for me, so there are no dictionaries other than Wikipedia. This was one of the most useful features of force touch for me :(


You can reinstall macOS without erasing your data or apps. Maybe give that a try. https://support.apple.com/en-us/HT204904


I set up my dictionary lookups to be done with a three-finger tap. Much easier.


Today I learned why my computer so often pops up the dictionary for no clear reason.


I use a mac for work and have definitely spent some significant time working with it as an actual laptop.

I really do appreciate the smooth scroll physics. But I really can't figure out why everyone raves about the gestures. I only ever trigger them by accident. I think I eventually turned them off, but only after screwing up a few times and thinking that I might some day actually want to use them.


If you're already a heavy keyboard shortcut user (especially the control+arrow keys on macOS), the touchpad gestures might not seem as useful. However, they are intuitive and natural in a way that keyboard shortcuts aren't, in my opinion.


Yeah, fair enough.


I map force click to Ctrl-Click in my IDE, and it makes it much more comfortable to navigate one-handed when combined with gestures for forward and backward movement.


After trying out my Magic Trackpad on Pi OS a couple weeks ago, and finding that experience to be somewhat frustrating, I found out about this project and immediately sponsored it.

Even on Windows, trackpad support with my giant Apple trackpad is pretty good. On Linux (at least with Pi OS and XFCE) I couldn't figure out how to get it to have any good acceleration curve or work at all like on my Mac or Windows laptop.

Scrolling was pretty smooth, but the cursor control was all over the place, so I had to pull out an old mouse for that day.


This is one of those things that mark where the Linux desktop experience falls short. I've been a trackpad user since 1997 when I bought a keyboard with a built-in trackpad for my home-built OS/2 system, and the superiority of the experience on Apple hardware has been a significant factor in why I've stuck with Apple as my primary OS since 2001. I've occasionally had to use Windows for work since then and the trackpad experience has gone from unusable to acceptable but it still hasn't caught up to the Mac and Linux hasn't caught up to Windows.


I've been using trackpads since Cirque launched the GlidePoint, using it first in OS/2 and later in Linux. I have never had problems with these things other than the touch surface wearing through; the things generally 'just work'. I tend to increase the acceleration factor so I can move from left to right in one fell sweep, use two-finger scroll, touch-to-click and sometimes double- and triple-touch to click for buttons 2 and 3. What are these problems I keep on hearing Apple aficionados talk about which make life on Linux unbearable? Is it fancy gestures? If so, there are tools for that. Is it the size of the pad? Well, just use a bigger one. For me it just works, so I don't understand what the problems are; all I read is that 'the experience is bad' and similarly subjective claims. Can you give some objective examples where a Linux-based desktop fails to live up to expectations related to trackpad support?


I've been using a 'magic trackpad' for years now, it is the only piece of Apple-branded equipment I ever bought (by getting it for cheap from the 'net), in Linux, and have not had any problems with acceleration curves or cursor control. I use it with 1/2/3 finger click gestures to substitute for the lack of those other buttons. I configure the thing through xinput since I'm not using a 'desktop environment' (Xmonad, dzen, dmenu, conky and trayer suffice). What are those terrible problems which made you pull out your mouse again?


I was impressed with the momentum this project has been building and was just about to sign up to become a sponsor. Reading the update though shows their current focus on X.Org - as an otherwise happy Wayland (Mutter) user I'm going to hold off supporting this until they start making changes that will benefit me.


Povilas from the linked article.

It's not obvious, but adding touchpad gesture support to X will benefit you too even if you're not using X.

This will make our work of convincing the maintainers of other open source projects much easier. X and Wayland together cover essentially all Linux users, so we will not need to estimate how many users would benefit from implementing gesture support in toolkit X or application Y.

This is really important, because maintainers of open source projects don't usually care about suggested features if they're not interested in them themselves. A new feature means additional work for them - discussing the design, reviewing PRs and handling eventual bugs. Often the person who implements a feature disappears after the PR is merged. This makes maintainers view all feature proposals with a grain of salt. We need a convincing story of how the majority of end users would benefit in order to make the contributions easier.


I don't follow...


Say you have a widget toolkit that doesn't support touchpad gestures. The maintainers of that widget toolkit would be more willing to integrate this feature if there's support in both Xorg and Wayland compared to Wayland alone.


I think he is saying that if it is brought to X it will be easier to convince Wayland to follow suit.

We'll see if that's true, or if we will need another round of sponsorship to achieve the same thing with Wayland.


Same here. I’ve already been supporting this project, but seeing how Xorg is going to be their focus for potentially many months, I am debating whether to keep my contribution up.


With support for both Wayland and Xorg, improving support in toolkits and applications makes more sense, though.


The focus on multi-touch gestures is missing the forest for the trees.

If you look at all of the next highest rated "features", they are not features but bugs. The reason that Mac touchpads are so nice to use has _nothing_ to do with multitouch and everything to do with palm rejection, filtering, etc. It is a hard problem, analogous to and more difficult than writing multicopter flight control software. Having looked at the trackpad code, and fixed (in a very hacky way) some of the issues I was having, I have zero expectation that Linux laptops will get a Mac-like experience in the next couple of years. Like all problems, it isn't technical, it is political.

Trackpad software and cut and paste are the two largest detractors for desktop Linux.


I don't have high hopes either. It takes a combination of hardware and software to produce a really good trackpad implementation. Apple has the advantage of total control over the hardware and years of research on the software. They bought FingerWorks (https://en.wikipedia.org/wiki/FingerWorks) in order to improve their touchscreen and trackpad products. Wayne Westerman spent years on the problem before getting acquired. I'd imagine many more person-years of research have been invested at Apple.

That's not to say we can't do better and it's not worth making an effort. However, thinking Linux can match Apple in terms of trackpads is wishful thinking, IMHO.


The flip side is that Linux can leverage a lot of Apple's research into what works. We don't need to do usability studies, we already know what we want.


Using xinput set-prop on the touchpad to tune parameters doesn't really fix it.

I end up having to disable the touchpad and use the external mouse just to be able to type without accidental mouse jumping around.

#1 linux usability issue on laptops in my opinion.


Could you elaborate a bit on the technical vs. political aspect at play here?

I was kind of just assuming that the trackpad issue might have been, at least partly, because the hardware macs use for trackpads was special. Are you saying that's not the case?

Also, what about cut and paste??


> Like all problems, it isn't technical, it is political.

Could you explain how/why? Is it about specializing in specific hardware/firmware?


Povilas from the linked blog here.

In open source, often no one really cares about a suggested feature or even a submitted pull request. It means additional work for the project maintainers - discussing design, reviewing code and handling eventual bug fixes. Often the person who made a PR disappears, leaving all future bugs as the maintainer's problem. So naturally project maintainers are risk averse and often say no to new features. Things end up not being implemented, not because they're technically difficult, or because there's nobody willing to do the work, but because the incentives don't align.

The above opinion applies to open source in general, as I've witnessed during the past decade. I'm not describing any of the touchpad- or input-related projects that I've been interacting with lately.


This post was delayed due to HN posting throttle, yay.

----

Because of sub-system ownership and crossing boundaries between window managers and input handlers. Just getting folks to recognize the problem and agree on how and where it should get fixed is the major hurdle. When software is overly modularized, it both prevents many classes of optimizations and forces certain communication patterns. Conway's Law [1] in reverse: teams will be split across component boundaries, and solving desktop Linux issues will require a level of coordination I am not sure the community is up to.

Writing the if statements isn't the issue; coordinating and agreeing on the problem is.

[1] https://en.wikipedia.org/wiki/Conway%27s_law


> Like all problems, it isn't technical, it is political.

Didn't the rest of this paragraph just explain how difficult a technical problem it is?


I'm curious how this will compare with https://github.com/iberianpig/fusuma/blob/master/README.md

I've also personally been using multitouch zooming and scrolling in kde for years now.

It will be cool to see a simplified, 'just working', desktop independent multitouch system for linux though.


With full gesture support we'll hopefully get more interactivity, so e.g. if you switch workspaces with a four finger drag it moves as you perform the gesture and stops if you stop moving your fingers. fusuma seems to only recognise the gesture and then run a command afterwards.


Exactly, gestures without interactivity are honestly not worth it in my opinion.


Depends on what "not worth it" means really. Even though gestures minus interactivity is far from how "proper" gestures feel, I'd rather have something rather than nothing (for example, it is still more convenient to 3-finger swipe to switch virtual desktop than a keyboard shortcut or clicking a button).


Ok, I admit that "not worth it" is better rephrased as "not the end goal". If the end goal is interactive gestures, one-off discrete gestures are welcome as an intermediate step or halfway solution.

Ideally the input framework would know if the streamed gesture was consumed in real time, and if not (e.g. no support for such an interactive gesture in some program), the one-off event would be issued.

This reminds me of the current Xorg libinput two-finger scrolling / wheel events. XInput2 is the relevant keyword, but I am not sure exactly how it all fits in, only what I can observe:

- Applications that don't know about multi-finger scroll/pan listen for and accept classic mouse4/mouse5 events and interpret them to scroll in steps if relevant. As an example, xev (the X event testing utility) is not XInput2-aware AFAIK, nor are classic X or older GTK programs.

- Applications can be XInput2-aware (e.g. eog, the Eye of GNOME image viewer, but maybe also any non-ancient GTK3 application), in which case they can scroll more directly (pixel-smooth) and with appropriate acceleration / smoothing / inertia (GTK-specific?). In Firefox there's an env var, MOZ_USE_XINPUT2, which tells Firefox it can do this smoother wheel handling; not sure if it's required or automatic these days. To test received events including XInput2 ones, there is the utility xinput --test-xi2.

As a closing anecdote, there's an interesting interaction bug I have experienced with XFCE where xfwm reacts to Super+scroll (compositor-level full-screen zoom) and the application under the mouse pointer also reacts to the scroll up/down. I have not deciphered the interactions here, but it depends on the app under the mouse cursor...


GNOME Shell (and KDE Plasma I think) already have this gesture on Wayland. Wayfire has it too, thanks to me :)


Our work on X will expose enough information to the window manager and the applications to implement workspace switching in the way you describe.

Fusuma is not integrated with the display server, so it's limited in what it can do.


It sounds like no matter how much money is put into this, the sheer variety in how graphics are done on Linux is going to make this a never-ending task. Others have rightfully mentioned that syncing animations with the trackpad is the big thing that makes the experience better, and even with Wayland, you're talking about interacting with both the window system and the compositor, both of which can be different.

I'm not a huge fan of apple, but these kinds of small details are basically impossible to get right unless you've spent decades relentlessly deprecating old apis and consolidating them so there's only one fish in the barrel to shoot.


Since we can't see the source for MacOS or Windows we can't judge whether or not these are the same challenges that they also faced. Aside from the different display server protocols the challenges may not be unique to Linux.

Perhaps the money that went into supporting this feature allowed them to overcome these challenges with much more ease. Unless someone who is more familiar with the source of MacOS can clarify otherwise, I think it would be safe to say that this is not a new challenge per se. Just new to Linux.


As a long-time Linux-on-the-desktop user and tweaker, an early adopter of libinput under X and of Wayland, and a historied user of MBPs with nice touchpads and Chromebook Pixels with glass touchpads, I just don't get it. There's still very little concrete information here that points to any solid net improvement for anyone using a modern Linux desktop (aka Wayland+libinput, which already support all of the original goals of this project).

If you read the first two blog updates, they're straight out of FAANG PM-speak land. I can just hear the words echoing from a PM who vaguely understands an amalgamation of customer wants, without having any of the technical context of actual customer needs, architectural limitations, alternatives, etc.

And this is exemplified in the nearly useless survey data. It is so poorly sliced and presented that it's hard to know what was learned. There's no axis for X/Wayland, which X input driver was used, what DEs were used, etc. All of these significantly matter in the day-to-day feel of the touchpad, considering how libinput is configured.

Frankly, and this blog post more or less implies it, Wayland and libinput address nearly all of the original goals, and the ways in which they don't are either bugs in the drivers/libinput quirks that should be filed and fixed in libinput, or are simply the choices of GTK/Qt toolkits or DE designers and are part of a bigger design discussion that could help establish better defaults across toolkits+DEs.

I just don't see why people are putting their hopes or money on this project. I don't see current, useful-to-Wayland problems being identified or worked on.

Finally, a tip: for Wayland + gestures, consider `gebaar-libinput`. And a solid net improvement to libinput (and up the stack) would be "stop scroll events". It's the only thing keeping the scrolling experience in Firefox+Wayland+WebRender from feeling just as good as Safari on an MBP.


My pixelbook touchpad works incredibly well (better than a macbook IMO). Is there any way that work can be integrated into this project?


Also using a Pixelbook here. Interestingly, the last time I had to reboot for an update it spent a long time reprogramming the firmware in the touchpad. I wonder if this means that Google develops the touchpad firmware or if they just ship vendor firmware. Either way, I agree it works perfectly and the correspondence between my gesture and the animation on the display is perfect.


It's so smooth, the lack of lag when you're scrolling is just perfect. The hardware helps, of course, but I have the same smooth scrolling on very dated hardware on which I installed Chrome OS, too.


With such limited resources, I wonder if just targeting wayland initially would get usable results faster.


Seeing as how multi-touch generally works at the display server level in Wayland, I think a backport to Xorg makes sense - it aligns incentives for maintainers to test and improve support in toolkits and applications.


Most users are on Xorg, so this will have the most impact.

There is also no single Wayland compositor they could focus on.


There are, in practice, 3 that anyone cares about - GNOME’s, KDE’s, and wlroots. I don’t know how much of their work is potentially portable across the three.

But while most users are currently on Xorg, the work going into improving responsiveness and suchlike is all going into Wayland. This project is really putting lipstick on a pig.


Also, compositors don't really need much improvement in this area; everything has been done ages ago.

It's all about the toolkits, and some great work is happening in GTK, e.g.:

https://gitlab.gnome.org/GNOME/gtk/-/merge_requests/1562

https://gitlab.gnome.org/GNOME/gtk/-/merge_requests/1117
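
To give a flavor of the toolkit-level API this kind of work builds on, here is a small hypothetical sketch using GTK3's existing gesture controllers. GtkGestureZoom and its "scale-changed" signal are real GTK3 API; the callback and widget wiring are made up for illustration and are not taken from the linked merge requests.

    #include <gtk/gtk.h>

    /* Sketch: attach a continuous pinch-zoom handler to a GTK3 widget. */
    static void on_scale_changed(GtkGestureZoom *gesture, gdouble scale,
                                 gpointer user_data)
    {
        /* 'scale' tracks finger spacing continuously, so the widget can
         * zoom in lockstep with the pinch instead of after it ends. */
        g_print("pinch scale: %f\n", scale);
    }

    static void attach_zoom_gesture(GtkWidget *widget)
    {
        GtkGesture *zoom = gtk_gesture_zoom_new(widget);
        g_signal_connect(zoom, "scale-changed",
                         G_CALLBACK(on_scale_changed), NULL);
        /* Tie the gesture's lifetime to the widget. */
        g_object_set_data_full(G_OBJECT(widget), "zoom-gesture", zoom,
                               g_object_unref);
    }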


I would settle for Chrome OS-like touchpad smoothness and gestures in Linux. The out-of-the-box touchpad sensitivity and scroll smoothness in Chromebooks with outdated hardware and something as low-end and old as a Celeron N2840 is incomparable to anything I've ever managed to configure in Linux, multifinger gestures included.


I'm still stuck on default trackpad Y-scroll direction under Linux. I've done some hacky input stream filter fix, but it's not working in ALL applications. (dunno why...don't really care...)

I STILL feel uncomfortable using my thinkpad scrollpad in geany under Linux, as it jumps and the scrolling isn't clearly indicating (through some animations, I gather?) the scrolling direction when I drag 2 fingers up and down...

Multi-touch, great!

...I just feel like the basic behavior still appears to be anti-intuitive, if admittedly equally logically viable (swiping the paper up or down vs swiping the scrollbar, metaphorically speaking).


Could be wayland/xwayland if you're using that. My first thought.


Thanks for the helpful tip! But alas, I'm still on Xorg.


Wait, what? So if they are ignoring Wayland, won't default Ubuntu users on GNOME miss out on these benefits? Seems odd to do all the surveys and research but not actually check what distro/compositor respondents use.


From the article, apparently Wayland display servers already support multi-touch gestures.


Ubuntu, as of 20.04, is using X by default.


One of the difficulties of working on any improvement in the free-desktop ecosystem is getting everybody on the same page. It's interesting to realize that the chops required to write the code are not necessarily the same as the experience required to push a project to completion. This of course shouldn't be surprising; the product management role is present in every high-performing organization. What's surprising is that it's taking so long for the open source ecosystem to realize the importance of this.


Double Tap to Zoom. That's literally the only thing I desperately miss from my touchpad on linux. If anyone knows of a way to get this on linux, please let me know I'd be ever grateful.


In case anyone is unfamiliar: on a Mac, in Chrome or Safari (not Firefox for some reason), on any article, a double tap with two fingers on any row of text will smoothly zoom the page so the text takes up the exact width of the window - effectively hiding the margins while preserving the formatting of the text (i.e. not reflowing the text).



I'm on Ubuntu Gnome with touchegg and use three finger up/down for workspaces and three finger left/right for the history in browser and Nautilus. There's even a gesture to toggle tabs in Firefox like ctrl-tab. It's not as smooth as a Mac, but still better than Windows :)


How does Android do multi-touch? How do Chromebooks? Is none of that work portable?


Woo! Amazing progress, and great to see the sensible approach. Well done Povilas and Bill.


Unsure why the blog article itself is in an iframe?

https://public.amplenote.com/embed/TnE3JJDDy5QDo5aZAYtqyHz4?...

This plan also has a huge flaw, one that can be shortened to one letter: X.

    > Add gesture support to the Xorg display server
    > A "fun" fact: if all of this work was already done today, it would appear on Ubuntu as of next April's release.
I'm betting you a drink that next April's release won't have an updated X server package. I haven't been following Ubuntu packaging too closely, but the last major X.Org release was in 2018, and the next one hasn't been discussed: https://en.wikipedia.org/wiki/X.Org_Server#Releases

I expect Wayland to be the default on even more systems by then; maybe even KDE could default to it? Though Qt has been weighing them down with the transition, there are fewer and fewer rough edges remaining.

So, this is probably going the hard way, for little at the end. It sounds like it caters more to the author's system configuration of choice (well, it's their money, though not really).

As the author said, Wayland has most of the bits in place; we mostly need to tweak toolkits into supporting this (Qt, SDL, wx, fltk, Gtk?, etc). Starting there would yield early results, and the code path might be reusable if X is worth implementing later.

On the other hand, X.org is probably nearing its final release, so if you want it to be buried with multitouch support, it's probably a good strategy to hurry.

Let's see who is left in the X ecosystem:

    * GNOME: moved on to wayland, mostly, though strangely not with Ubuntu
    * KDE: moving along, a lot needs to be done in Qt
    * i3: sway is mostly compatible and quite nice to use (typing this there)
    * XFCE/LXDE: not sure, I think the projects have merged into LxQt, which has wayland support coming along. XFCE itself meanwhile doesn't have a timeline for wayland, but I think it's on the horizon
    * Openbox/fluxbox/Windowmaker: you're probably not into wayland. But there's a myriad of Wayland compositors that cater to the same
    * firefox: starts to work quite nicely, has hw-accelerated video decoding now under wayland.
    * Inkscape: supports it as of the 1.0 release
    * Gimp: On the roadmap for 3.0 with the GTK3 port (same situation as Inkscape)
Tablet support is here (mostly); network transparency now has an alternative, albeit quite young, with waypipe ("network transparency" has never been a fundamental issue with Wayland, waypipe was bound to happen, and I expect a lot more rough edges will disappear once 90% of the users are on Wayland).

The missing pieces of the puzzle now are probably xdg-desktop-portal + pipewire support in the web browsers, which I often see people complaining about (and GNOME having their own screenshot interface doesn't help). That is coming. And the last piece, unified color correction with HDR support: I don't know enough about it to comment, but I've at least seen some work on HDR support for sway.


Chromium is also still X-only, which also includes Electron apps.



