Looks like a cool project! However, I think relying on hardware buttons as one of the main input sources is a bad idea. They're just not reliable, and they aren't engineered for continuous, frequent use.
I miss the buttons, but there just aren't enough of them on the phone.
I have a (big) box of old mobile devices which could easily run Linux as well as a Raspberry Pi. I really should do something with them.
That said, the PinePhone's site has some amusing text: "the PinePhone runs mainline Linux as well as anything else you’ll get it to run." LOL. It'll run anything that it runs. OK then!
It's not as nonsensical as it sounds: since it's an open hardware (iirc, with 2 exceptions) and open software device, no one is locked out of doing or running anything they want. So rather than the typical 'if you can jailbreak it and then figure out how (good luck with that!), maybe you can do it', it's instead 'if you know how to write/port it and can work with our well-understood hardware, you absolutely can do it.'
A good example even on 'open' Android devices: you are typically stuck with a particular binary blob for the GPU drivers pretty much forever because there is no publicly available documentation for the hardware and therefore no open source GPU drivers are likely to ever exist.
I've tried this, and after 30 minutes of use it actually feels pretty natural. It's similar to learning the shortcuts for a normal tiling window manager.
Capacitive touch screens enable the most flexible and expressive interaction patterns invented to date. I can't for the life of me figure out why you would hamstring yourself by using the two or three hardware buttons as a primary input method.
One reason might be accessibility: for some people, it's a lot easier to hit a hardware button than a tap target on a touch screen. You can feel around for the edges of the button, you get clear tactile feedback when it's pressed, and you don't have to worry about accidentally hitting something else nearby.
It would be quite easy to improve both the accessibility and the practical UX of capacitive touch screens by establishing a norm where every potentially destructive action (anything that cannot be undone with ease) can be made to require a 'slide to confirm' gesture. This workflow is already used by community-made "recovery" environments for touch devices, but it would be nice if it were adopted more widely.
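To make the idea concrete, here's a minimal sketch of what that gesture could look like, using standard DOM pointer events. The 80% threshold, the element selector, and the deleteItem action are all illustrative assumptions, not from any particular toolkit:

```typescript
// Minimal sketch of a 'slide to confirm' guard for a destructive action.
// The pattern: arm on pointer-down, measure the drag on pointer-up, and
// only fire the action if the slider crossed most of the track.
function attachSlideToConfirm(
  track: HTMLElement,          // the slider track element
  onConfirm: () => void,       // the destructive action to guard
  threshold = 0.8,             // fraction of the track that must be crossed
) {
  let startX: number | null = null;

  track.addEventListener("pointerdown", (e) => {
    startX = e.clientX;
    track.setPointerCapture(e.pointerId);
  });

  track.addEventListener("pointerup", (e) => {
    if (startX === null) return;
    const travelled = (e.clientX - startX) / track.offsetWidth;
    startX = null;
    // A tap or a short drag does nothing; only a deliberate,
    // near-full-width slide triggers the action.
    if (travelled >= threshold) onConfirm();
  });

  track.addEventListener("pointercancel", () => { startX = null; });
}

// Usage, guarding a hypothetical deleteItem() behind the gesture:
// attachSlideToConfirm(document.querySelector("#delete-slider")!, deleteItem);
```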
Sony Vaio laptops used to have a 'jog dial': a combination dial and button (basically the same as a mouse scrollwheel/button). This provided a very convenient method of navigation, and this looks like a perfect use case for it!
But I think an even better input method was the BlackBerry trackball and, on later models, the small trackpad. I have the last BlackBerry that had a trackpad (the 2014 Classic), and to this day I'm still amazed at how fast and precise it is. Rather than moving a mouse cursor the way a laptop trackpad does, it moved the focus from control to control. Combined with a hardware back button beside it, you had full control over the device by moving your thumb slightly. It worked with gloves too! And desktop websites were not an obstacle, because there it gave you a proper mouse pointer that could precisely target small controls.
Nothing else comes close to it. The Classic had a small touch screen compared to today's phones, but I think a trackpad paired with today's phablets would be a great combination.
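For anyone who never used one, the model is roughly the sketch below: each input tick hops a focus highlight between controls instead of moving a pixel cursor. This is my reconstruction, not BlackBerry's actual API; arrow keys stand in for trackpad swipes, and the clamp-at-the-ends behaviour is an assumption:

```typescript
// Sketch of BlackBerry-style focus navigation: move a highlight from one
// control to the next instead of dragging a pointer across pixels.
class FocusNavigator {
  private index = 0;
  constructor(private controls: HTMLElement[]) {}

  move(delta: number): void {
    if (this.controls.length === 0) return;
    // Clamp at the ends rather than wrapping (my assumption; the real
    // device may have scrolled the view instead).
    this.index = Math.max(0, Math.min(this.controls.length - 1, this.index + delta));
    this.controls[this.index].focus();
  }
}

// Arrow keys stand in for trackpad swipe deltas here.
const nav = new FocusNavigator(
  Array.from(document.querySelectorAll<HTMLElement>("[tabindex]")),
);

document.addEventListener("keydown", (e) => {
  if (e.key === "ArrowDown" || e.key === "ArrowRight") nav.move(1);
  else if (e.key === "ArrowUp" || e.key === "ArrowLeft") nav.move(-1);
});
```

The nice property of this model is that focus is always on exactly one control, so a small, imprecise input device still produces precise selections.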
I remember the Rockbox firmware for MP3 players having something similar. I was (and still am) amazed at how efficient the UI/navigation was relative to how limited the (hardware) controls were on some of those players. It took a little while to understand how it works, but that tiny bit of learning curve was well worth it.