One Frickin' User Interface for Linux (2003) (anu.edu.au)
78 points by networked 1061 days ago | 78 comments



Some time ago I read what seemed like a much stronger case against the Year of the Linux Desktop: the lack of binary compatibility. It is hard to develop commercial software for Linux when it means building your software for every distribution and supporting all of them. Anyone buying a piece of software would probably have to stop upgrading their system if they wanted to keep using it. Different desktops add to this pain, but not as much.

Of course, Linux is not about buying software, so I guess not many people care. But the availability of a certain professional piece of software (aka a "killer app"?) is often what eliminates Linux as a choice for many people, in my experience. (That is, now that there are good graphics drivers available and you can watch bloody YouTube.) Ultimately, in a very ironic twist, this makes Linux most suitable for people who do not know anything about computers. I install Ubuntu for them, they don't touch anything except updates, and everybody's happy.


Of course there is a twist to the binary compatibility story: namely, Linux (the kernel) works very hard to maintain userspace binary compatibility. Afaik you should be able to run some random 20-year-old executable as long as you have the right libraries or it is statically compiled. Also, considering that X11 (the protocol) has been backwards compatible for quite some time, I'd say that cross-distro binaries on Linux are definitely achievable if the application developer puts the effort in. What is missing most is documentation and tooling to support such efforts.

In the near future I also see containers as a possible cross-distro packaging solution, but that too needs some tooling to make the process more pleasant. Container-supporting kernels also need to become more prevalent for containers to truly be a solution.


I did not know that, thank you. I guess the key is in the phrase "as long as you got the right libraries"? Kernel+X alone is not enough... There should be some simpler solution to this than deploying your app in a container, but I really don't know enough to even dare guess.


You can bundle .so's with the application (this is how many humblebundle/steam linux games work) or you can statically link those libraries.
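A minimal sketch of the bundling approach, with entirely made-up paths and names (a stub stands in for the real game binary so the layout can be demonstrated end to end):

```shell
# Build a throwaway app directory with bundled libs and a launcher.
set -e
APP="$(mktemp -d)"
mkdir -p "$APP/bin" "$APP/lib"

# Stub "game" binary: just shows what the dynamic linker would search.
printf '#!/bin/sh\necho "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"\n' > "$APP/bin/game"
chmod +x "$APP/bin/game"

# The launcher users actually run: prepend the bundled lib/ directory
# before exec'ing the real executable.
cat > "$APP/run.sh" <<'EOF'
#!/bin/sh
HERE="$(cd "$(dirname "$0")" && pwd)"
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$HERE/bin/game" "$@"
EOF
chmod +x "$APP/run.sh"

"$APP/run.sh"
```

The wrapper script can also be avoided by linking with `-Wl,-rpath,'$ORIGIN/../lib'` so the binary finds its bundled libraries relative to its own location, or by going fully static with `-static`.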


Steam is probably the best shot in a decade at selling consumer software for use on Linux. It'll be interesting to see if it takes off.


It could have been, but right now, 1.5 years after launch, Steam on Linux feels quite half-hearted. Steam Machines, which were supposed to bring SteamOS (and Linux) to the masses, have failed to materialize. Looking at the Linux game releases, it seems like Valve is not providing enough developer support to make high-quality releases. Instead, from what I have seen, the games have been quite inconsistent from a technical point of view. Maybe half of the games (or less) in my library work at all on my Arch Linux systems, even after some minor tweaking. I wonder what they intend to do in a couple of years when the now-current Ubuntu becomes so outdated that nobody runs it anymore. I wouldn't be surprised if they end up shipping some sort of Ubuntu environment to run the games in.


The purpose of Steam on linux was never "to bring linux to the masses", and you're a fool if you honestly believed this story. It should've been obvious the moment they created SteamOS rather than sticking with the first plan of shipping as an application on Ubuntu: the purpose of SteamOS is, and always was, for Steam to liberate themselves from greedy platform providers. They're not about to divorce Microsoft only to jump in bed with Canonical.

The purpose of SteamOS is to power the eventual Steam pseudo-console, and to sit Steam at the centre of game distribution, rather than Microsoft - or rather, put Steam on par with Sony and Microsoft, instead of being a mere middle man.

SteamOS, for all that matters, might as well be a proprietary platform on top of Linux, much like Android is. Its users won't even know it is running linux, nor will they have root access by default - they'll be given an interface which is the primary portal to games (and "Apps") distributed by Steam.

Anything "useful" for the rest of linux that comes out of Steam is likely to be kernel additions, drivers, and other low level stuff that is merely incidental - released because the GPL requires it. Nothing is going to solve the "problem" of cross-distro packaging, for instance: Steam will only distribute for SteamOS, and it makes zero economic sense for them to do anything otherwise.


Agreed, I've bought several games from Steam since it became available on Linux, the process is easy and most of the games have been great.


In practice, the way to run old binaries on Linux is ... Wine.

So we have our long-term-stable ABI for Linux. It's just that it's called Win32. An ABI that will survive Microsoft.


What the author somehow doesn't get is that you shouldn't see Linux as a whole. You should look at different distributions like you look at Windows vs OSX. Different distributions are totally different to end users, although they share a lot of the core system utilities. Ubuntu or Linux Mint could be picked by a large corporation and everybody in that corporation would have the exact same interface. Normal people don't choose Linux, they choose a distribution.

This way, the normal people have the hard choices made for them and the crazy hackers like myself can do it all themselves on their Gentoo machine.


I don't agree. Desktops matter. I use the same XFCE version in Ubuntu and Fedora, and they look/behave the same. You only start to find differences once you install software.

I guess Unity is only available in Ubuntu, but that's just because not enough people want to use it in other distros.

Just one anecdote: my sister used to run FreeBSD on her computer and she always thought of/referred to it as Linux. She was using Gnome 2, and it was indeed the same experience (not completely true: compiled packages were faster, a kind of Gentoo-like result).


Of course Desktop environments matter, that's why there are so many of them. I would be able to use Ubuntu almost like my own machine if I installed my window manager and utilities on it. What I meant, however, is that the defaults of a normal user-friendly distro are good enough for those users. There is no need for them to tweak stuff because their LibreOffice and Chrome/Firefox just work with those defaults. The choices are only there if you look for them, and at that point you are not a normal user any more.


The desktop is no longer the battle that matters (more mobile devices are in use today than desktops and laptops, and the gap is widening). Linux has won the mobile market...and I guess one could argue it did so with one frickin' user interface (Android, though that's somewhat fragmented by the often-horrible, always-stupid, vendor customizations, like HTC Sense).

We will, I guess, get to see what happens when Firefox OS comes along. Then again, it's so very different from Android (and Android is so far from a traditional Linux distro) that I don't know if the OFUI theory will really be tested in that case, since there are so many other variables.

Generally, the "OFUI" idea is good as long as it's the OFUI that I like. I happen to really like Gnome 3, so I'm fine with it being the One True UI of my preferred Linux distro (Fedora). If I hated Gnome 3 (and I understand why people do...it had a rough and very buggy and very restrictive breaking in period that lasted a lot longer than it should have), I'd be pissed off at the suggestion that KDE (or LXDE or XFCE or whatever) should be removed completely.

Anyway, I don't find myself agonizing over how Linux (or any open system, as that's what I really care about; it wouldn't need to be Linux for me to be happy, as long as it's Open Source) can win the desktop anymore. Partly because open is winning on mobile (at least, in terms of the OS, though there are many dangerously proprietary parts of the mobile industry, including on Android devices), and that's an even bigger market. But also partly because a big part of the reason I needed Linux to win was that I wanted to use Linux for everything, and there needed to be a critical mass of users for hardware, software, etc. to be good on Linux. These days I rarely think about hardware, as it Just Works. And I rarely worry about software, as there's tons of amazing stuff to choose from. Even games are pretty good on Linux, which is a relatively new change.

So, I want more Linux more of the time. But, I don't know if it has to "win" for all of us Linux users to win.


Have you ever used the NDK?

It is quite limited in the set of official APIs and POSIX support.

Google could change Android's kernel for something else UNIX like and almost no one would notice.


Why does it even have to be UNIX like? It can be anything as long as it can run the Android Runtime. Right?


Because the NDK uses C and C++, as well as partial UNIX APIs. Nothing to do with the Android Runtime.


They'd notice the lack of device drivers.


Android is installed by the device manufacturer. You don't attach any peripherals to your phone. Even when you connect your phone via USB, it's the computer that does the reading. So I doubt the user will notice the lack of device drivers.


Device drivers are not just peripheral drivers. The baseband driver, the wifi driver, the kernel display adapter driver, the touch driver, the drivers to handle the hardware buttons, the PCH and FSB controller hubs, bluetooth drivers, bluetooth device drivers, and NFC drivers, at the least. You may also be missing accelerometer and thermometer drivers.


You don't talk directly to the device drivers with NDK APIs. So I'm missing the point.


No Fucking Way. This kind of mentality is what is resulting in systemd being foisted down everyone's throat.

Multiple parts doing single tasks with loose couplings. That is what I came to Linux for.

If I wanted some kind of "One True Way"-ism I would have stayed with Windows or bought a Mac.


First, I don't grok the vocal minority's vehement opposition to systemd.

Second, loose couplings are great for those who want to tweak to their heart's content, people who know what they're doing. I used to tweak the heck out of Gnome 2. I grew tired of it and appreciate the simplicity and elegance of Unity and dislike the "we know what's best for you" attitude that Gnome 3 conveys to me. Unity just needs to have 3 sane defaults reverted that I've mentioned in other comments.

Third, you can still tweak Linux to use whatever desktop you want. Back when I was on Windows I even hacked the registry to make the DOS command prompt my UI, as a joke, and it worked: it booted with no explorer running at all, just the Windows XP kernel, the few processes it started, and the DOS CLI. It taught me that most of the stuff users interacted with on XP was actually started and run by explorer, which explained why the system grew so unstable when it crashed and I had to restart explorer.exe from the Task Manager (by hitting Ctrl+Alt+Delete, IIRC).
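If memory serves, the knob being described here is the Winlogon "Shell" value; a .reg fragment along these lines (XP-era, sketched from memory) is all it takes:

```
Windows Registry Editor Version 5.00

; Replace explorer.exe as the user shell with the command prompt.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"Shell"="cmd.exe"
```

Setting it back to "explorer.exe" (or deleting the value) restores the normal desktop on the next login.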

Having a common UI/UX is great for a lot of people out there, including those of us who know all about loose coupling and tweaking things but usually just want to do what we want to do without all of that tweaking. Except on those weird days on the week-end when we try new stuff out, like checking out how KDE is coming along. Still too complex for my tastes but it looks great!


There's nothing about loose coupling that implies poor UX. It's more about seamless interchangeability of components, which, if you don't care about it, will make no difference to you.

Trying to impose an ad-hoc diktat in the Linux community where you insist that X shall be the one true DE, Y shall be the single package manager and Z the service manager, does nothing but piss off people who actually care about Linux, and will do nothing to bring about the Year of the Linux Desktop, because most end users simply take whatever OEMs ship.

But there's a more destructive thing. By doing this, you're sending out a message: "Fuck experimentation. Fuck academic research. There is only one path and one dogma."

Now, if it was just about having de facto defaults, that's tolerable. The issue is that deep-seated standards and dependency chains are being created in the process, where someone who researches and implements a formidable alternative to a piece of system software will have to go through a ton of unnecessary and superfluous hoops just to stand a fighting chance, all because of the narcissism and egos of a bunch of people who thought that they could reinvent the world, and this time they'd get it right, dammit.


Loose coupling allows the "shiny thing of the week" folk to not wreak havoc on the "good engineering" folk.

When DEs like Unity followed Microsoft down the disastrous blind alley of "desktops should act like tablets", that was OK because the loose coupling allowed by defined protocols like ICCCM meant that I could wait it out in XFCE.

The fact that practically every distro has gone with systemd, and that the couplings are tight, means that I have to leave Linux entirely to wait out this disaster.


Agreed, the designers of X made some truly brilliant decisions.

http://en.wikipedia.org/wiki/Inter-Client_Communication_Conv...

Edit: I don't have a dog in the hunt on systemd, but if it really is tight coupling then history might prove it was a bad decision. There were a lot of, let's say, strong opinions against PulseAudio on Linux, but I find (once we got through the growing pains) that it's a blessing. It's far superior to Windows 7's basic audio management in my opinion. I've no idea about Windows 8.


Everything PulseAudio does should have been implemented as part of ALSA though, not as another layer that only exposes a fraction of the libalsa functionality.


Wouldn't that imply putting a network-accessible server in the kernel? That seems a fair bit more risky than non-root userspace...


No, alsa actually does its most interesting stuff in userland.


You clearly haven't used Unity; it's basically a clone of the OS X 10.6 interface (before they changed spaces) built on top of GTK3. It's got a useful set of standard key bindings (hold SUPER for a cheat sheet). It would actually be pretty awkward to use on a tablet.

Gnome 3 OTOH is as you describe: giant touch-friendly menus, swipe to unlock (with a mouse), etc.

Your position seems a little alarmist. How does systemd mean you need to stop using Linux? I'm sure XFCE will continue to work.


Agreed about Gnome 3, but from the Wikipedia article on Unity:

"In July 2012, at OSCON, Shuttleworth explained some of the historical reasoning behind Unity's development. The initial decision to develop a new interface in 2008 was driven by a desire to innovate and to pass Microsoft and Apple in user experience. This meant a family of unified interfaces that could be used across many device form factors, including desktop, laptop, tablet, smart phones and TV. Shuttleworth said "‘The old desktop would force your tablet or your phone into all kinds of crazy of funny postures. So we said: Screw it. We’re going to move the desktop to where it needs to be for the future. [This] turned out to be a deeply unpopular process.”"

In answer to your question, systemd is in an excellent position to inject non-determinism into the functioning of the OS at all kinds of levels. I've got a hair-trigger response to things breaking and fixing themselves randomly, courtesy of a couple of years working with Windows. systemd has already demonstrated this behaviour, and I feel fairly safe in predicting that this will increase as systemd gains in complexity. Never to the point of being a major problem, just enough to be a persistent annoyance.

Major point being, however, that if it does turn out as I expect, there's going to be no avoiding it while still using the mainstream Linux distros.


My homelab (and VMs) have been on Jessie for some time now - systemd is the default and I really like it. It solves a lot of issues.

Before systemd:

* is it checkconfig, configtest, etc to test this service config?

* Why are you storing logs in /var/log/messages, /var/log/syslog, and your own logs?!

New features:

* Dependencies, systemd handles these nicely.

* Different types of services, from simple to one-shot (which is where your startup scripts are meant to go! The devs have thought of this!)

* We can do away with GRUB menus if you only have one OS (systemctl set-default rescue.target) - you can also set it in your boot parameters.

* journalctl - well, at least it's all going to one place now....
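For what it's worth, the one-shot case above looks roughly like this unit (the file name and script path are made up):

```ini
# /etc/systemd/system/my-startup.service  (hypothetical)
[Unit]
Description=Run my old rc.local-style startup script
After=network.target

[Service]
Type=oneshot
ExecStart=/usr/local/bin/my-startup.sh
RemainAfterExit=yes

[Install]
WantedBy=multi-user.target
```

Then `systemctl enable my-startup` wires it into boot, and `journalctl -u my-startup` shows everything the script printed, in one place.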


> is it checkconfig, configtest, etc to test this service config?

checkconfig and configtest were just initscript arguments that exec()ed the relevant program's own sanity checking facilities. I don't see how this is relevant.

> Why are you storing logs in /var/log/messages, /var/log/syslog, and your own logs?!

Solved much more elegantly by the likes of daemontools and derivatives (s6, runit, etc.), many years before systemd.

> Dependencies, systemd handles these nicely.

Hardly "new". Dependency systems in init daemons and service managers have been rolled many times, from the primitive need(8) facility to the LSB initscript headers (which almost all SysV-based distros had adopted) to OpenRC, Solaris SMF and so forth.

systemd's model is far more involved. It's also far more prone to races and dependency loops.

> Different types of services, from simple to one-shot (which is where your startup scripts are meant to go! The devs have thought of this!)

The semantics for managing long-running services and doing one-shot scripts are different. You can use standard tools from util-linux to craft much of the execution environment that systemd gives you through its unit options, and then time these either through at or cron. Nothing compelling here.

> We can do away with GRUB menus if you only have one OS (systemctl set-default rescue.target) - you can also set it in your boot parameters.

Um... have you actually read what the systemd rescue shell does?

It boils down to this:

      plymouth quit
      echo "Friendly message here."
      sulogin; systemctl default
Where systemctl default is like going to a multi-user runlevel or whatever other overlay/synchronization point/service group/milestone/term used in other systems.

Yeah, it's totally worth switching over because of this.

> journalctl - well, at least it's all going to one place now....

See comment above about daemontools' solution.

There's also a ton of log indexers specifically tailored to high configurability.


> Solved much more elegantly by the likes of daemontools and derivatives (s6, runit, etc.), many years before systemd.

Then why is no one using them?

> Hardly "new". Dependency systems in init daemons and service managers have been rolled many times, from the primitive need(8) facility to the LSB initscript headers (which almost all SysV-based distros had adopted) to OpenRC, Solaris SMF and so forth.

Exactly. Isn't the point of systemd to bring the ideas from launchd (and SMF) to Linux?


Agreed, if I wanted "One UI to rule them all", I'd be typing this on OS X or Windows, instead of Linux + KDE

(Re: systemd though I don't dislike it as long as I can choose to remove it, I don't really care about it)


And the missing introductory question is: "why should Linux aim at world domination?"


I too wonder why, as a linux user, I should care how many other people use the system. It seems to me that we have a really good community going and if more people join in, good for them, but Linux isn't in any danger of fading into obscurity at this time.


You should care because a certain critical mass is needed to have a decent choice of properly supported hardware and software. There's no immediate need for worry on that front today, since we are well above this critical mass. However, the original article was written in 2003, more than 10 years ago, when things looked rather different.


> You should care because a certain critical mass is needed to have a decent choice of properly supported hardware and software

Except that if they achieve this by foisting "one true UI" and one true whatever on you, you might find yourself having less and less choice...


Maybe for the newest hardware things are fine and dandy, but even on a laptop I bought 2 years ago, I have some pretty basic Linux driver issues (mainly graphics cards), and battery management issues as well. If 50% of the population were using Linux this wouldn't happen.


So I can uninstall Windows.


> The core API for both Windows and Macintosh is procedural C language.

> The core API has to be a C language binding.

> A disadvantage is that the core API is in Objective C which is unfamiliar to most developers, but a straight C API could be generated, as Apple have demonstrated with MacOS X.

The core API for the OS X UI is definitely not in C. There is only an Objective-C API, and it is not a shim over anything. (It uses the Quartz windowing system, but that does not provide things like button widgets, etc.) There is no C equivalent of NSView, for example.


> There is no C equivalent of NSView, for example

objc_msgSend(objc_getClass("NSView"), sel_registerName("alloc")) ...

:)


Haha, ok sure. :)


Same thing on Windows. Since Windows XP many new APIs are exposed only via COM interfaces.

Only masochists use COM from C.


"Gnome 2 is the 1FUI for Linux"

"A major company needs to act as champion and enforcer of the 1FUI by bringing out a distribution that runs only Gnome apps."

I think starting from 2004–2006, with Canonical and Ubuntu, Gnome 2 was indeed on a good path to become the dominating UI for Linux. But then in 2010–2011 Ubuntu and Gnome parted ways, and the leading position of Gnome 2 was split into Gnome 3 and Unity, with some refugees going to Cinnamon and Xfce. And KDE is still going strong, too.


Can confirm, am an XFCE refugee. Never thought about it before, but that's exactly correct; I don't really know why I use this desktop environment except that every alternative is useless.


Canonical really messed up there. I can understand the maverick attitude but not when it trumps common sense. Gnome 3 was a mistake too and that was a major mess on the part of the Gnome people.


I've been using GNOME 3 on Debian Jessie for about a year now on my main work machine and I have to say that I'm very happy with it. I switched to it after having used a XMonad desktop for about a year, I think that says a lot.

What I like is the fact that everything just works (tm), just as it used to on Mac OS X. No need to fiddle with 10 volume control applets on your minimal tiling WM, none of which work well with PulseAudio. Same for the battery indicators, manually loading your desktop background, setxkbmap etcetera in your .xinitrc. You're basically forced to mix-and-match your whole desktop.

GNOME 3 comes with batteries included. Everything works out of the box. Yes, the eye candy is a bit much, but I think the Activities menu is very useful and productive. I can just use my computer, like I did on Mac OS X before. But now the Linux desktop also gives me much better performance, package management and stability.

The only thing I miss on GNOME 3 is XMonad's tiling system and keybindings. It's a shame they dropped the ability to swap in your own window manager.


So then the only remaining desktop is KDE, which is now on the 5.0 series, is being adopted at a reasonable pace, and is not breaking the user experience while improving what it can.

Does that mean it wins?


Having one UI would be the death of Linux. I would guess that the vast majority of Linux users today (including myself) use Linux because of choice (i.e. I use i3, and I can't live without it), and removing that choice would essentially be abandoning those users.


I reckon Quantum OS, the Linux distro based on Google's Material Design guidelines, is trying to achieve something similar:

- http://quantum-os.github.io

- http://google.com/design/spec/material-design/introduction.h...

Personally I would like to see 1FUI as an intra-BSD desktop environment for all the BSD operating systems, most notably OpenBSD.


The site reads in my mind like Andy Warhol on computer UI/UX design.

I think Canonical has a clean design with Unity, just fix some simple things as default:

1) Menus should be per application to reduce user mouse movements.

2) All menus and buttons should have icons on again; these provide valuable visual cues and reduce eyestrain.

3) Return scrollbars to the normal ones seen on other desktops; remove the scrollbar overlays, they are confounding and confusing.


If I didn't have a choice, I'd probably have abandoned the Linux ship long ago, frustrated with Unity and GNOME 3. Instead, I had one, and I am sincerely glad of that.

This article makes it seem like people developing GUIs are at war just because, rather than sticking with (and working on) projects because they like them and want them. If you want something you should make it, right?

In addition, it's not like this is a huge problem, given that most people are going to abandon desktops soon anyway.


>> In addition, it's not like this is a huge problem, given that most people are going to abandon desktops soon anyway.

I don't see that happening at all, really. The desktop is a paradigm that fits very well with a lot of computing needs. I don't see that ever changing.


What works is consistency between UIs. The QWERTY keyboard has won, for example.

Having used the Mac for so long, I am trained to use my thumb for cut, copy, paste -- command X C V. Using my pinky finger in place of my thumb is a painful amount of gymnastics for me using ctrl X C V on Windows/Linux. At the end of the day I need to use Windows and Linux (and Mac.) Maybe I fiddle with keyboard setting files in Linux or install that old key swapper program that came with Win95, but the reality is I am going to have to use ctrl X C V, even if they conflict with the control key on the command line.

It's like driving different cars. Most things are the same, but you get used to certain things even if they are somewhat awkward, and new interfaces are usually annoying. Driverless cars can solve this. And the same is true for user interfaces. The best UI is no UI at all.


Keyboards are completely pluggable though. It's just a character/text interface into the system, so the physical layout can be anything you desire.

For instance, I use this layout: https://www.kinesis-ergo.com/shop/images/1463/advantage-blac...

Cut/Copy/Paste is the same CTRL+X/C/V combination that you are using, but the fingers I use are my thumb and one of the three fingers to its immediate left.

With this particular keyboard, every key can be remapped on the hardware itself (no config files), and the physical keys can be removed and placed (almost) anywhere else on the board.

For full disclosure though, the primary reason I use this keyboard is that it allows a much more natural separation between my hands (8-10 inches), which allows my shoulders to rest in a more natural position. It's not perfect, but definitely an improvement over the "cram it all into a tiny rectangle" approach that most other boards use.


You can map caps lock to cmd on OS X and ctrl on Windows/Linux if you want to always have the key in the same place when you're using copy/paste etc. I find it's more comfortable than wherever ctrl is (though cmd is reasonably placed).
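On X11 that remap is one command (it needs a running X session, and goes in your startup scripts to make it stick):

```shell
# Make Caps Lock an extra Ctrl; use ctrl:swapcaps instead to swap the two keys.
setxkbmap -option ctrl:nocaps
```

On OS X the equivalent lives in System Preferences under Keyboard, Modifier Keys.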


These types of posts are super silly. What he's proposing is literally impossible given free licenses. You cannot stop someone from forking your project and you cannot stop users from using the forks. All discussion of how nice it would be if there were no forks is irrelevant.


We HAVE a standard user interface.

It consists of some sort of Taskbar, some sort of Launcher/Startbutton and some application windows.

KDE, GNOME2, XFCE, Windows 95 up to Windows 7 all use this. OSX to some degree. It is all the same.

Am I asking too much if I believe every (non-technical) user should be able to wrap their head around this and get comfortable with it within a few weeks?


The user isn't the problem; it's the fact that they all work differently. There's no unified API for the launcher/start button, task tray, etc.


I know that KDE and Unity have adopted appindicators for system tray functionality: https://unity.ubuntu.com/projects/appindicators/

I'm not sure what you want in the way of unified startbutton API. Even within KDE itself it has Lancelot, Kickoff, and Homerun, all providing a sliding scale of classical start menu to Unity / Gnome / Windows 8 style dash. That is one domain that has demonstrably not one ultimate answer, and I'm a big fan of the KDE attitude of making multiple choices available with the default being moderate (search by default, tabs for programs / recents / power / etc).


Which, if I recall correctly, was the initial reason for creating the Freedesktop.org project and the XDG specifications. I'd say the success has been humble, but by FOSS standards it has done quite well. It is rather GNOME-centric, though.


Isn't it great?

I tried getting used to the new Ubuntu way of doing things, which I've heard is based on the Apple way of doing things. I didn't care for it.


This is what irks me about Unity. I have to run several gsettings commands after a new install to get rid of those dumb overlay scrollbars and to return the menu to the window that owns it. I leave the control buttons on the upper left (I kind of like them there), but I can't stand overlay scrollbars, and I really dislike having to move the mouse all the way to the top just to work with the menu for an application, which is why I do the gsettings thing. It's also what turns me off about the Mac: one menu at the top, shared, for the active app.

Edit: I also run some gsettings commands to turn menu icons back on. Also, you can return menus to the owning application in the Unity UI: right-click the Desktop, choose Change Desktop Background, select the Behavior tab, and it's there.
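For reference, these are the commands I believe are meant here (keys as of 12.04-14.04-era Unity; schema names may have moved in later releases):

```shell
# Kill the overlay scrollbars
gsettings set com.canonical.desktop.interface scrollbar-mode normal
# Put menus back in each application's window (14.04 and later)
gsettings set com.canonical.Unity integrated-menus true
# Turn menu and button icons back on
gsettings set org.gnome.desktop.interface menus-have-icons true
gsettings set org.gnome.desktop.interface buttons-have-icons true
```

These need a running Unity session with dconf; `gsettings list-recursively` is handy for finding the current key names on a given release.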


Unity worked fantastically when it was called Ubuntu Netbook Remix, and it was run on an 800x600px display, as everything is maximised like a phone because there isn't any other space.

I agree in basically every other context that it is annoying/useless, but for smallish laptop screens it's still not bad.


This issue can be helped if we stop referring to low level parts as an "OS" like Linux or even GNU tools but refer to individual distros as OSes, like Ubuntu or ElementaryOS, etc. This at least creates the impression of unification under one banner, which I think is what the article is asking for.


It's called Android.


It's ironic that Android isn't on the desktop, given how much greater access they would have to expand their brand.

If they made a single touch/click that pulls the top menu _all_ the way down, then it would be easier to use on the phone and on the desktop.


I suspect that the lack of 1FUI is also part of why high DPI support is lagging pretty badly on Linux. In some window managers/distros you can get something that sort of/mostly works for built-in stuff, but once you launch a third party application like Firefox it usually doesn't end well. I'm sure the Firefox devs would gladly support high DPI screens on Linux if there were a single standard way to do so. There isn't, though, and there probably won't be one anytime soon due to the fragmentation of the platform.


Well, high DPI support isn't something to rave about on Windows, although that may be for similar reasons (WinForms vs WPF vs whatever)


XFCE for the win.

I don't agree with this article. Having alternatives is good.


You could argue that the OP's preferred choice, NeXTStep, did win, and the loser was Linux.

I haven't used Linux on the desktop for years, but have recently been thinking about going back as Apple seems to be abandoning all of the things which got me to switch. Is it really so hard to make a stylish upgradable/serviceable laptop with a matte screen? That, and maintain the OpenSource packages that you (used to) ship like X11 and OpenSSL?


I think having choice of desktop environment is a good thing. However, I do think he has some point on interoperability.

Copy/paste, for example, works differently in desktop apps vs. terminals, etc.

File selectors are another thing. I like the KDE one, and the file selector is the reason I ditched Gnome altogether, but it would be best if KDE programs used the Gnome file dialog when running under Gnome, and GTK programs used the KDE dialog when the user runs KDE.


The author has this backwards. If you want "one single whatever for everyone", you first need world domination. You need world domination because in order to have "one single whatever for everyone", you have to be able to control everyone in the world, so that you can prevent the situation that multiple creative people work independently on similar competing things.


Where did the idea of a "year of the Linux desktop" originally come from, anyway? I often see people joking about it or advocating it (Linus, even!), but I've never seen any sort of foundational essay or manifesto - does something like that exist?


It was always a joke, and the joke is on people like the author of this post, who treat linux as a tribe rather than as a useful tool to get your operating system of choice to interact with your hardware of choice.


I would start small; for instance, can you get every API to use the same foundation types (similar to Core Foundation on Mac OS X)? These kinds of changes would help programmers a lot and pave the way for more complex standards.


Sure, as long as it's exactly what I prefer at the present time.



