I personally support the work of the Haiku project and have not had as many issues as the writer of this article. Maybe if so many sites were not bloated with garbage, the web browser would not have fallen over. Sites should have a fallback mode for situations where JavaScript is disabled or the browser is not HTML5-compliant. As for the email client falling over, I think there is more at play than the author is letting on.
There is far too much of this navel-gazing in tech-press product reviews these days. Like the iPad being championed as a work device just because you can write up a blog post and a tweet on it, with complete disregard for any profession that isn't accomplished solely via tweets and blog posts.
Hilarious that a tech blog is being held up as a basic challenge for a web browser. IIRC the early WebKit team had extensive hacks and workarounds to make sure the abysmal code of major sites like this one and the newspapers actually rendered "correctly".
I've been trying out Haiku for several years. I have to say that it brings back very fond memories and is astoundingly feature-complete, but the limitations of its architecture are beginning to show: the system has no built-in notion of security and lacks even the most basic primitives for such purposes, insofar as it isn't multi-user and has no notion of access privileges. The architecture is POSIX-compliant, but without users it is surprisingly limited.
Of course there is no real chance of extending the architecture to 64 bits or of porting it to non-x86 architectures.
Haiku is a great effort, but it's a requiem for what BeOS could have been.
EDIT: it seems that there is a 64-bit version I was unaware of.
The main thing that was holding back the next release was the infrastructure around the package manager, but most of the development hurdles with this seem to be sorted. There's some ongoing work to build packages before the beta release, but I can't see that taking the rest of the year.
Many people don't want to even try nightly builds, so they try the last many-years-old release, see issues (many of which are already fixed), and write it off.
Were I them, I'd only block a new release on the ability to do manual and automatic updates from the UI. From there everything else is gravy.
By iterating on ideas away from user demands, you can more quickly make progress with less concern about breaking changes. However, this period is only really useful whilst the fundamental parts of the software are getting implemented. Some would argue that package management is a fundamental part of an OS, others would not.
In other words, I'd expect that you'll see more regular releases after the first beta is out.
Which is why the story of BeOS is a story of lost potential.
The bigger problem, these days, is finding a floppy drive that still works, and a floppy disk. I had to install MS DOS 6.22 on two computers about two years ago, and finding three floppy disks for the installation media was rather challenging. ;-)
Today, I don't think we need another under-developed desktop OS; if anything, pouring more resources into Linux to get something that's actually competitive with MacOS and Windows instead of just semi-competitive would be a better use of time. Alas, after 20 years of waiting, I feel like that's unlikely to happen either.
Rewatching late-90s BeOS demos made me really emotional. The actual performance, the mindset, the aesthetics: so much was right. It felt better than my latest Linux on a 2007 dual-core.
A better solution would be for GCC to provide a means of working with binaries and libraries produced by older versions of its C++ compiler, ideally including version 2.
Perhaps a more realistic solution would be to create "proxy" libraries for the previous GCC versions, whose only job is to redirect calls to the latest version.
Problem is, when I actually need to get work done, it's easier to just use Windows 10.