Ubuntu will switch from X window server to Mir (ubuntu.com)
327 points by dz0ny on March 4, 2013 | 331 comments

Maybe this is a good idea, I don't know about X/Wayland enough to say. But it worries me that Ubuntu is increasingly striking out on its own. What I like about the GNU/Linux ecosystem is that a lot of distros share a lot of common underpinnings, and everyone benefits from a large community fixing bugs and improving those underpinnings. It's also less knowledge to have to keep in your head for system administration stuff. (Which is still necessary in Ubuntu, regardless of what the "it just works for me" people say.)

Maybe this is the kick in the pants Linux needs to increase adoption. But I would much rather know GNU/Linux, not Ubuntu. Now Ubuntu is standing alone with Compiz, Unity, Upstart, Launchpad, and Mir, all pretty fundamental pieces of the core system. In a decade, will switching from Ubuntu to Debian be as big of a culture shock as switching from Windows to Linux?

In terms of user interface, have those standards delivered?

I remember fonts being really screwed up for a long time. Longer than it should have been. Just getting anti-aliased fonts was a major deal.

Then there are graphics drivers, which are a mess. I blame the device manufacturers here, but treating the driver as a black box is still an inferior approach. I know Linus flipped off Nvidia, but at the same time it's a chicken-and-egg problem. Why should they bother to improve their drivers? To make GNOME 2 desktops look shinier? You criticized Compiz, ok, what is the alternative?

Google did pretty much the same with Android. We have surfaceflinger, dalvik, custom IPC mechanisms. Now they are trying to slowly merge some of them in.

> Maybe this is the kick in the pants Linux needs to increase adoption.

That is what I think. Here is a company that tried to push, and everyone resisted and said "you must adhere to standards defined by a 20-plus-year-old graphical server". I think that is silly. Let them innovate. I, for one, am excited for them.

> Let them innovate

If Canonical were good at it, everybody would be saying "yeah, go for it!"

But they're not.

Every time they strike out on their own, they seem to end up producing overly complex, badly performing messes with tons of issues that everybody hates. bzr, unity, ...

The only reason people use any of their stuff is because ubuntu pushes it on their users, and people with little experience end up using it by default. Sound familiar...?

[Worse, with Mir, they seem to be going for the lock-in effect: "mir will integrate especially well with unity"... >< ]

That perception, that Canonical is not competing on fair terms—almost anything they come up with, no matter how awful, will come with a built-in user base—is one of the big reasons they get so much flak for this sort of thing. If it were just some guy in his garage hacking on his new display server after lunch every day (like, you know, Wayland...), and gaining users the hard way, by being better, then nobody would blink an eye.

For that reason, I think people expect more maturity from an influential company like Canonical. They expect them to play nicely and work with others (e.g., if they think they've identified issues with wayland, work to fix wayland; or cooperate in designing a next-generation replacement) instead of striking out boldly on their own every 15 minutes.

Again, if they had proven really good at it, they'd get a pass to some degree—but they haven't. Instead, by making technical merit less of a factor in determining what becomes widely used, they're effectively reducing the overall quality of FOSS. Can you blame people for being annoyed?

> [if Canonical were] gaining users the hard way, by being better, then nobody would blink an eye.

Maybe they are "being better".

I have no doubt that technically Wayland is better than Mir, but as a user who lost interest in configuring Linux back when FVWM95 was considered interesting, I don't care.

I use Ubuntu (including Unity!) because it works. I never, ever want to deal with weird configurations to work around some bug, and for the most part Ubuntu lets me avoid that.

For me, "better" means "it just works". No other distro does that as well as Ubuntu.

(BTW, Unity isn't that bad once you've used it for a while)

I agree. I, for one, also like Unity. My mother, who is in her late 60s, also learned it quickly and now likes it. The interface is less cluttered and easier to use.

At first there was a small learning curve, but I don't remember often thinking "hmm, I wish I had my GNOME 2 environment or my KDE environment back".

Maybe I too got old and spending hours tweaking my GUI settings just stopped being fun. I just want to fire up the machine and get work done.

Can you elaborate on what you dislike about bazaar and Unity? Or how Canonical is able to "force" a vcs on its userbase?

I've used Unity since it was released, never had any issues, and generally think it's the best GNU/Linux window shell around.

Bazaar is also ridiculously simple, and I've started using it as a front-end to every other vcs. The support for git is a little rough, but I can do in one command what git users have to do in four, so I deal with it.

I think that the vocal minority of Ubuntu power-users (which is sort of a laughable term) hates change. This is what made Unity such a "failure" when really it's a pretty great system.

Launchpad supports Bazaar only. That's one of the "top ten reasons" to switch to Bazaar according to the docs [1].

That said, I don't really think Canonical is "forcing" anyone beyond playing the "defaults" card in the distribution they control, and even then you can simply not use the stuff that is installed by default (although I think that making the desktop dependent on some applications that I may not want to use is a little bit against the bazaar philosophy of Linux distributions).

[1] http://doc.bazaar.canonical.com/migration/en/why-switch-to-b...

EDIT: formatting

Launchpad, like GitHub, is offering a service, and a free one at that. It is not a valid argument. It would be like asking GitHub why they don't support other VCSes.

> making the desktop dependent of some applications that I may not want to use

Examples please.

I don't know if it is a valid argument or not, but according to Bazaar docs it's one of Bazaar's strong points.

AFAIK the official Launchpad FAQ says Bazaar is the best VCS for the project:


And apparently they can (or want to) support only one. Fair enough.

But github does support other VCS!

Really? Link please

Supporting parts of the Subversion client is not the same as supporting Subversion repositories.

I don't know if this constitutes a "power user" usage pattern, but watching a colleague try to work with Unity with dual monitors was hugely amusing.

I'm curious to know what you can do in one command in bazaar that you need 4 in git to do.

Here's my power user setup for dual monitors with Unity:

1. Plug in dual monitor to HDMI port. 2. Use Unity.

What did your colleague have trouble with?

(Four might have been slight hyperbole, but git is a usability nightmare. I forget what exact task I can do in one command that it takes 3-4 to do in git, there's probably some set of absurd arguments to do it in fewer, but I don't know them and am not willing to spend hours reading tutorials to discover what magic buttons I have to press to get git to do what I want.)

> What did your colleague have trouble with?

I remember him trying to move a window to the other monitor and having it bounce back repeatedly. Very funny to watch.

> git is a usability nightmare.

No argument here. The design of the command-line UI is terrible ("git checkout" is the first offender you're likely to meet). However, it's powerful enough that I use it all the time, occasionally bridging with SVN when needed. Once you get hooked on cheap local branching, you can never give up.
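For what it's worth, "cheap local branching" looks something like this. A minimal sketch using a throwaway repository (the branch and file names here are made up for illustration):

```shell
# Branches in git are just pointers into the commit graph, so creating,
# switching, and deleting them is nearly instant and purely local.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial"
base=$(git symbolic-ref --short HEAD)   # "master" or "main", depending on git version

git checkout -q -b experiment           # spin up a scratch branch
echo "risky idea" > idea.txt
git add idea.txt
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "try something"

git checkout -q "$base"                 # back to the base branch, untouched
git branch -q -D experiment             # discard the experiment at zero cost
git rev-list --count HEAD               # prints 1: only the initial commit remains
```

Nothing here touches a server; the whole experiment lives and dies on local disk, which is the habit that is hard to give up.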

Eh, Bazaar has cheap local branching as far as I care about. And yes, I don't have to deal with git checkout.

Window management can be funny -- I think there's a bug in compiz where a display will "reject" windows if they're too large (when minimized) for the display. This causes the drag-to-top motion to hit the display the window is currently on, and so it goes back to the original display. Using control-alt-numpad5 works, as does unmaximizing your windows and making them smaller than your smallest display.

I beg to differ. I used Redhat, Mandrake, Gentoo, Debian, and now Ubuntu and I have been happily using Ubuntu for a long time now. Unity is great. I barely need to do any configuration with it, everything just works. Under the hood it's still good ol' old-school Linux.

I have tried Mint on 2 other computers; for all the talk about Mint being the next great distro, I find it to be far less stable than Ubuntu. I have also started only using Ubuntu's LTS releases with a handful of PPAs to get more recent packages. That helps with stability a lot.

Yes. I shall follow with interest and run Ubuntu if my hardware allows. CentOS and Debian are there as lifeboats if needed.

Choice is good!

Is Mint with Debian finally taking off?

That has the potential to be a really easy choice for desktop users, but a couple years ago it wasn't very smooth at all.

One of the largest issues that I can foresee with this is proprietary drivers for graphics cards. NVIDIA has delivered high quality drivers for Linux/X11 for their cards and tends to release new drivers quickly when new X11 versions come out. Are graphics hardware manufacturers expected to support separate drivers for Linux (X11) and Ubuntu (Mir)? And potentially distributions that use Wayland? I can only imagine how AMD/ATI will handle putting out graphics drivers for some other windowing system that Ubuntu decides to use.

From TFA: "Right now, Mir does not run on desktop hardware that requires closed source drivers. However, we are in contact with GPU vendors and are working closely together with them to support Mir and to distill a reusable and unified EGL-centric driver model that further eases display server development in general and keeps cross-platform use-cases in mind." Good luck with that. How many years did it take to get decent AMD drivers for Linux/X11? The catalyst X11 drivers are still currently a mess.

> Ability to leverage existing drivers implementing the Android driver model [1]

This is huge for ARM; very few vendors bother developing any sort of OpenGL driver for X11, and drivers for the same GPU are often not only SoC-specific, but sometimes device-specific as well.

[1] http://fridge.ubuntu.com/2013/03/04/mir-an-outpost-envisione...

> What I like about the GNU/Linux ecosystem is that a lot of distros share a lot of common underpinnings, and everyone benefits from a large community fixing bugs and improving those underpinnings

Exactly. Standards exist in the GNU/Linux world for a reason. Mir, Unity, etc. don't seem to play too well with other distros and are very tightly integrated with Ubuntu; if this continues, Ubuntu might turn into a slightly more open OS X, which isn't good at all.

Standards are great and all, but holding on to those standards is what's keeping back the UX of Linux. Gnome3 is familiar, but very dated. X is supported by all, but it's a tangled mess that everyone hates.

Yes, Ubuntu is straying from the pack, but I'd rather have them stray and the rest follow when they find something good than have the entire Linux community sit in the same place they have been for years. Regardless of what path Ubuntu takes, I think it will be good for the Linux community to have that diversity.

> Gnome3 is familiar, but very dated.

Have you used Gnome 3? Familiar and dated are not words I would use to describe it. The Gnome team is clearly focused on user experience and without rendering judgment as to whether or not they've succeeded (disclosure: I think they largely have), it's immediately obvious to new users of Gnome 3 (see Torvalds for the canonical case) that the product is not clinging tightly to familiar design patterns.

Yeah, like Unity cough or upstart cough.

RH is doing similar stuff, although they're more deeply involved, and while I don't especially like how they impose some of their technology, at least it's generally well thought out.

Upstart is available in the Debian package repos, and it was used for a while by Red Hat before they switched to Systemd.

Fedora package != RHEL defaults. Debian package != Debian defaults.

RHEL 6.4 still uses SysV init, in fact.

EL6 uses upstart[1], the SysV init scripts are using the rc compatibility.

[1]: https://access.redhat.com/knowledge/docs/en-US/Red_Hat_Enter...

Have the Debian defaults (by some definition) had an opportunity to update since upstart was packaged for Debian?

Exactly this. My chosen distro is Linux Mint. Based on Ubuntu but going its own direction.

> Based on Ubuntu but going its own direction.

Like how Ubuntu is/was based off Debian but going its own direction? ;)

That's what open source is all about, my friend! The first distro I ever used was Debian, though only as a live CD because those were back in my WinXP days. But I immediately fell in love with apt. Then about 2 years ago I moved to Ubuntu; it's nice, but I didn't love Unity (didn't hate it either, just different). A friend recommended Mint, and I have been hooked ever since.

Much more open. Like actually open. There is a reasonable case that there has been too little innovation in the last 10 years. I first used X Windows more than 20 years ago, and it was amazing then, but is it holding stuff back with old standards now?

There are very few standards in the GNU/Linux world. GNU and Linux have, by themselves, become standards through ubiquity. POSIX is one, yes, but it's woefully inadequate once you've gotten used to all the bells and whistles any modern GNU/Linux distro puts on top of it by default.

And what about the BSDs? They certainly don't play well with Linux, or vice versa, but that's hardly an argument for calling either an agent of closedness.

Apple already has more developers contributing to useful upstream code than Canonical (WebKit, CUPS, LLVM).

Placating the tech crowd by giving away fragments of low level code is really just marketing and a way to retain talent that would otherwise refuse to work for them. It's not as if their users can actually use any of that stuff to change how their macs work (without switching to a different OS or browser).

WebKit for one has fairly wide adoption outside of pure Mac OS X. As for the rest of it, just because something is open source don't expect many people to use it.

Yes but then it's also being developed by far more companies than Apple.

For 2012 Google submitted roughly twice as much code to Webkit as Apple (which came second), apart from them we have smaller yet notable contributors like RIM and Nokia (7% and 5% respectively).

Let's not forget that reduced maintenance costs are often touted as a reason to open-source code in the first place. And when you consider it was developed by Apple, and so presumably fit their needs well when they released it as working code, it's not surprising that adapting it to new use cases involves more work than maintaining it.

It was developed by the KDE project as KHTML. The reason that it's open-source is that it was Apple's fork of an existing free software project.

You'd be complaining if they contributed nothing. Damned if they do, damned if they don't.

He said they weren't doing enough. You're pointing out he'd also complain if they did even less.


Actually, I was just pointing out that they are not open in any meaningful sense of the word - their products are very, very closed and controlling and that's how they like it. Comparing them to Canonical is in my opinion very silly for that reason, it's apples to oranges.

Apple is the biggest tech company in the world. Canonical is a company that focuses on the Linux user experience.

Didn't Apple buy CUPS specifically so that they could keep parts of it proprietary?

They're definitely only funneling money into LLVM so they can drop their dependencies on GCC, which will probably have a net harmful effect on the free software culture.

> They're definitely only funneling money into LLVM so they can drop their dependencies on GCC

Definitely and only? How did you determine this?

It's sort of obvious? I can't see any other reason Apple would have for it, other than making compiler researchers' lives slightly easier. It'd certainly make sense after they were forced to open-source the frontend for Objective-C.

Apple had two technical needs that GCC could not reasonably satisfy. One was to compile Core Image filters. When you make a chain of filters, they treat it as a single complicated filter which they JIT compile and optimize for your GPU or CPU (I believe they decide at run time which to target based on what will be best for the particular complex filter you have defined).

Another technical need they had was for a compiler system that could be easily and tightly integrated with other tools, such as IDEs and debuggers.

Either of the above would have required extensive modifications to GCC, and they would not have been able to get those modifications into the upstream. The result would be Apple would have to maintain a fork of GCC. They'd be spending a lot of effort porting things from upstream into their fork.

So would you say that Apple is definitely only funding LLVM so they can drop their dependency on GCC?

Circular snark is circular.

> which will probably have a net harmful effect on the free software culture.

Mind explaining how funding the development of free software hurts free software?

1) Replace GPL code with BSD code. 2) Release a subset of the features to the open source community.

Why would Apple be so allergic to the GPL unless they want the ability to close stuff? If you really think Apple would stay open if they dominated the market, look at the iPhone.

This is only "hurting free software" if you are a complete radical like rms, who believes that any closed source software is an affront to humanity, blah, blah. Otherwise, it doesn't matter what Apple closes, as everything that was Free before, is still Free afterwards. If you don't like Apple's stuff, fork it. If you don't like closed source software, don't use it. I don't see anybody putting a gun to anyone's head and forcing them to use Apple stuff.

This is a shallow view of the situation.

Your metric is essentially "no knowledge is lost from the Free world." However, this means that all free software developers could die tomorrow and your metric would be satisfied.

A better metric is "the most free software that could be made is made." Switching from a GPLd compiler platform to a non-copyleft "freemium" Apple platform is worse for this.

If you don't care about free software as a political goal, there is no purpose to have this discussion in the first place, as you don't care about free software being hurt.

I think you're setting up a false dichotomy here. "You care" or "you don't care". Reality is more nuanced than that. Also, the idea of "all free software developers dying tomorrow" is just hyperbole, since the chances of that happening are essentially nil.

Yes, we'd all like to maximize the amount of F/OSS software in the world. F/OSS is a Good Thing, and I founded a company based around F/OSS for a reason. But it's not this horrible tragedy / affront to humanity, if Apple (or whoever) takes a step away from a purist "Free" software position.

To put it another way: no one owes you (or me, or anybody else) a world full of all the Free software we want. And even more so when plenty of people in the world don't care about software freedom. As long as those of us who do care have the option to fork and continue development of projects, then the actual freedom remains, as far as I'm concerned.

> Also, the idea of "all free software developers dying tomorrow' is just hyperbole, since the chances of that happening are essentially nil.

That is exactly my point. You proposed a metric for the health of the free software culture: "Nothing that is free now is non-free in the future." This is a bad metric because if you eliminate all free software development, it reports "everything is okay."

> You proposed a metric for the health of the free software culture: "Nothing that is free now is non-free in the future."

It seems to me that you're turning something analog into something binary. "Nothing that is free now is non-free in the future" is true, relative to any particular project, and is - as a worst case - not so horrible. But applying that to "free software culture" in general, as a comparison to the idea of ALL free software development stopping, doesn't sound reasonable to me. It's like you're suggesting that, say, Apple, moving away from GPL'd "Free" software towards BSD software (and possibly a "free core" model) automatically implies that everybody else does too. But that's just as likely to happen as your hypothetical of every Free software developer dying tomorrow.

The "every free software developer dying tomorrow" is a hypothetical statement that has nothing at all to do with Apple. It was in my comment because it's a great way to illustrate that even if things aren't explicitly subtracted from the Free world, the Free world is harmed if its growth is slowed.

> the Free world is harmed if its growth is slowed.

OK, but "harmed" is a broad term. I'm harmed if I stub my toe, but I'm also "harmed" if I fall off a bridge and land on my head and crack 4 vertebrae. But there's a big difference between those things.

And anyway, my point was (at least partly) that the growth doesn't necessarily stop because of - for example - the Apple deal (or something like it) because people can always fork. And if the last GPL'd version of something that goes closed is popular enough, it gets forked. See: Nessus[1] / OpenVAS[2].

[1]: http://en.wikipedia.org/wiki/Nessus_%28software%29

[2]: http://en.wikipedia.org/wiki/OpenVAS

How is it "slightly more open"? From what I can see, all the code in Ubuntu is open source to the fullest extent of any accepted definition out there, including the full freedom to fork and start a distro that pulls from both Ubuntu and third-party patches if you feel that Canonical is not accepting said third-party patches readily enough.

On the one hand those standards came from somewhere, and that somewhere is likely someone else forking an existing project and making changes to it.

On the other hand, adhering to standards that are years old isn't a great way to advance. Standards have to grow and adapt and, wait for it, change.

Which standards are these? Besides using a kernel compiled with gcc, dynamically linked ELF binaries as the default, and a libc on the system, the only standard that I ever knew of was LSB, which I believe does not include anything about graphics (I could be wrong).

They're de facto standards, not formal standards. The fact is that you can compile a GTK program that will run on most Linux and even other Unix-like systems without problems.

It means that someone creating a program, proprietary or open source, can, with little effort, run that program on most Linux distros. We have quite a handful of internal GUI tools; some use Qt, some use GTK, written in Python and some in C or C++. The GUI stack makes up the biggest stack here, but they use a few other libs as well, e.g. cURL, OpenSSL, GSL.

All of these are readily available, and we run these programs on various versions of RHEL, Debian, Ubuntu, and FreeBSD because of the de facto "standards" provided by these environments. Even porting these (at least some of them) to Windows, where many of these libraries are available, would take too much time because of minute details; there are other real problems we need to focus on. We hope we can still run these programs on Ubuntu in the future, though it would not take much to make that infeasible.

I actually do share your fear. That's the reason I run Debian. I really hope this does not destroy Ubuntu as a Linux distribution (which I believe is the standard you and the poster I answered were referring to); as long as it remains based on Debian, I believe there's little to fear.

> Standards exist in the GNU/Linux world for reason.

That reason being that they grew organically. And the landscape isn't even unified as it is! Most distros come with Gnome, but you can run KDE if you are a bit more leet, or if you want something more lightweight you have Xfce. Unless...

I'm happy Ubuntu is showing some vision for what their OS can become and work towards that goal.

Slightly more open?

In what way is Ubuntu closed?

They are not talking about Ubuntu being closed. They are saying that Ubuntu would drift toward being more like Mac OS X, but be more open than Apple is with Mac OS X.

They said "slightly".

OSX is hardly open at all. Only a few pieces are (important pieces, but still). Something completely open is more than slightly more open than OSX.

An open OS X, while slightly oxymoronic, sounds pretty fantastic to me.

> Maybe this is the kick in the pants Linux needs to increase adoption. But I would much rather know GNU/Linux, not Ubuntu.

I'd rather know Unix, not GNU/Linux or Ubuntu. Maybe Ubuntu diverging a bit from other Linux distributions, and from GNU, will encourage people to write portable code.

Canonical might have problems with the current de facto toolchains of the GNU/Linux ecosystem, or they may be too large or unmaintainable for what Canonical wants to do with them. Like you said, maybe it's a good idea, maybe it's not.

Starting clean, you can take all the knowledge gained from previous open projects and create something fresh that is more lightweight, faster, and easier to maintain. But will these new components live up to expectations? Only time will tell, and Canonical is willing to back that venture.

I also believe one needs to stop calling Linux "Linux" (or the less widely adopted but, for me, more correct term GNU/Linux). Call the operating system you use by its distribution name; that is the actual ecosystem you are using. Which, yes, is part of a greater Open Source, Free Software ecosystem.

But to achieve innovation one needs to push boundaries, and at the moment Ubuntu is the desktop OS with the most commercial and adoption success.

My 0.02c.

There are basically two ways for them right now:

1) Make the GNU/Linux stuff work on Android. That's what Tizen does, AFAIK. It's a lot of work. Advantage: everything works with it, and it's well understood.

2) Glue stuff up to work with whatever Android does. That's what they're aiming for. Advantage: less work.

Why didn't the Android team make the GNU/Linux stuff work on Android? Because it is a lot of work?

Why should another company then spend resources to do that work?

I don't understand what you mean by your second statement. But that might just be because I'm dead tired. But I cannot stop myself from reading and typing!!1

> Why didn't the Android team make the GNU/Linux stuff work on Android? Because it is a lot of work?

Because it's GPL licensed. That's it.

Google has spent a lot of time and work ripping out perfectly good pieces of code that are GPL-licensed and replacing them with (sometimes inferior) BSD- or Apache-licensed pieces. At some point, the only piece of GPL code left will be the Linux kernel.

Device manufacturers' lawyers are very paranoid. That's (part of) the reason Android has its own BSD-licensed libc, "bionic", instead of glibc.

When Google ported Chrome to Google TV, which runs Android, they ported glibc, too. Apparently, it was easier for them to port glibc to Android than to port Chrome to the more limited bionic.

Funny thing is that Debian is already on Google Play.

Yet in the end, you pull the entire Linux FOSS community toward whatever whim Google is following at any given time. If they suddenly redid the graphics API and threw out their current batch of binary blobs, the FOSS space would have to get back up and try again after having the rug pulled out from under it.

I don't see it as good if Google can lead FOSS desktop development around by a leash.

Are you saying that Canonical is the 'entire Linux FOSS community' ???

I switched my personal computer from Ubuntu some time ago. It's kind of startling when you see the divergence from core Linux tooling that Ubuntu's been doing.

I believe this is the kick in the pants my team at work needs to evaluate moving to Debian deployments instead of Ubuntu.

Yup, I can understand that, but it looks like Canonical are moving to the phone/tablet/turnkey device/fridge market anyway.

I think there is room for Canonical with a slick tablet OS, and Debian/RHEL clones/The Others (Arch/Slack &c) as "traditional" OSes

They are not changing the path of Linux, but the path of graphical UX on Linux. I think it's a good thing. The GUI is simple enough for my grandma to use, yet I can open a terminal and have a very familiar experience. I see no harm here, as they will not diverge from Linux itself.

Critically, they seem to be putting a lot of effort into being inter-operable with Wayland and X. We have two major widget toolkits and it's not a problem because I can use applications from both without any trouble.

As long as what Ubuntu is creating is open source and easily adopted by others (if it's good), what's the problem?

Wayland is already mostly there, and was put together by experienced X developers who've given up on trying to make X do everything modern display environments need. The main problem with moving away from X has been the lack of a single, demonstrably better target to move to. Mir will fragment that effort, and the rationale seems to be "NIH".

The 'if it's good' part is the problem... Canonical don't exactly have a good record of creating well-engineered software...

On the other hand, innovation and inventing can be good. There are new things all the time (e.g. git).

> What I like about the GNU/Linux ecosystem is that a lot of distros share a lot of common underpinnings, and everyone benefits from a large community fixing bugs and improving those underpinnings.

Mir is FOSS too.

I don't know if this is necessarily a bad thing or not. I mean, sure, it's handy to be able to switch distros seamlessly, but has that ever really been possible (aside from going from, say, Ubuntu to Debian, or Fedora to CentOS)? Going from Fedora to Ubuntu or vice versa, for example, has never really been a completely seamless experience.

It's always been the case that, in the strictest sense, "Linux" is a kernel, not an operating system, and each distro is really its own OS. That they had a lot in common was a fortunate bit of happenstance in a lot of ways. Now, as they start to diverge, that may mean more competition, which should lead to faster innovation and even more progress. And as long as everything is F/OSS, the distros that pick a bit of tech that "loses" can always switch to the "winner" later.

I'm not saying that it would be totally pain free, mind you. But I can see how this sort of move might benefit everyone in the long run.

It doesn't sound like a good idea. What's wrong with Wayland?

From the article (and I'm sorry if this seems condescending; I would truly love to have a conversation about this, but I believe your question is too vague):

* The input event handling partly recreates the X semantics and is thus likely to expose similar problems to the ones we described in the introductory section.

* The shell integration parts of the protocol are considered privileged from our perspective and we'd rather avoid having any sort of shell behavior defined in the protocol.

[1]: https://wiki.ubuntu.com/MirSpec#Why_Not_Wayland_.2BAC8_Westo...

What worries me is fragmentation of drivers. It can be a very serious problem.

Possible scenario:

Nvidia waits for broader Wayland adoption before releasing a Wayland driver. Along comes Canonical and drops Mir on our heads. Nvidia scratches their heads and says: forget Wayland. Not sure if they'll say forget Mir as well - they have better things to do than deal with this mess - but none of this sounds good already.

Consider games development. People wait for better drivers (Wayland)? They aren't coming. Should they say forget this mess (Linux)? Or does Canonical think everyone should refocus driver development on Mir, the way Android did on mobile, creating horrible fragmentation again?

I don't think so. The reason we've had Nvidia and AMD support on Linux for so long is the large 3D/SFX and GPU-accelerated number-crunching presence Linux has.

The end-user Linux desktop market has never really been on their radar; perhaps that will change somewhat with the advent of Steam, but I expect that unless the aforementioned markets adopt Wayland or Mir (the latter being most unlikely imo), Nvidia and AMD won't target them with proprietary driver support.

It looks like both Wayland and Mir are supposed to use KMS drivers.

Do you have a link to this assumption? I would really like to know.

I found these:

* https://wiki.ubuntu.com/MirSpec#Mir_on_the_Free_Graphics_Dri...

* https://wiki.ubuntu.com/MirSpec#Mir_on_HW_Supported_By_Close...

we are in contact with GPU vendors and are working closely together with them to support Mir and to distill a reusable and unified EGL-centric driver model that further eases display server development in general and keeps cross-platform use-cases in mind.

I hope the latter means Wayland-compatible drivers, because if not, it's going to be bad.

My interpretation (and apologies if this is obvious to you) is that they will define certain interfaces between the driver and the outside world, and any manufacturer will be able to provide something (a library or a daemon, I guess) that implements those interfaces.

When they say "cross-platform use-cases", I think it means they want to make it easy to build the Mir driver and Android driver from the same source tree - and quite possibly the Windows and OS X driver too. So hopefully they will have some quite general interfaces.

I think the ideal scenario is that the driver interface they define has a very general interface, and then there's a wrapper that turns that interface into an Android driver, another that turns it into a Wayland driver, another that turns it into an OS X driver, and so on. But I don't know if graphics drivers would accept the performance penalty of even a very thin wrapper.

I see all this as a huge waste of effort, unnecessary complications and increasing confusion. The main drive behind this seems to be desire of Canonical to control the development (they can't do that with Wayland), rather than any valid technical reason. Quite disappointing.

Here is what Wayland / X.org / Mesa developers have to say on this:


Let's hope.

I didn't take the driver situation into consideration, and it seems that Ubuntu is contacting closed-source manufacturers to allow driver support for Mir.

I like your scenario, because I believe an interesting element is that Valve is trying to port most of its Steam games to Ubuntu, and they do have muscle with Nvidia and AMD. And there are rumours that the Steambox, called Piston, would be based on Ubuntu.

So in my opinion it looks bad for distros not using Mir and good for distros using Mir. Whether that is a good thing altogether is another debate.

And as mentioned above, professional game development for Ubuntu is coming at a rapid pace. Maybe at the price of another horrible fragmentation, as you call it.

I don't consider Ubuntu usurping the focus of game development and drivers a good thing. And notion like "using Mir - good for you, not using Mir - too bad, no drivers" doesn't sound good as well.

Ubuntu is already quite isolationist. This will only make matters worse.

Why would you say Ubuntu is taking game or driver development by force or illegally? As far as I can tell there is no other competition in this space, so it is actually very open, and it's not like Ubuntu is unwilling to share.

I agree with the notion, but we still need to see if it will play out that way. For now it's only on the table, with a lot of other cards we aren't even aware of.

I didn't say anything about "illegally". It's all open - "do what you want". It's the question whether it benefits the global Linux community, or creates unnecessary internal competition / fragmentation and spreading of resources.

Competition for drivers can start (if they won't be compatible), look at Android targeted hardware and see how sick it is now (it's SurfaceFlinger only, try getting Wayland drivers and good luck with that). We don't need the same story repeated with Mir.

Sorry, I looked up the definition of "usurping" and ended up being pedantic, because I don't believe it's true.

I understand all the concerns about how it will impact the global community; but isn't competition generally seen as good, and whose resources is it spreading or fragmenting?

* A distribution that tries to cater to everyone?

* Other companies' employees who now need to understand and support technology their company doesn't want to use?

* "Linux" users (in double quotes because they want to be able to administer all their different boxes, running different distros, using the same old tools of 20 years ago)?

* FOSS coders who spend their free time fixing bugs and adding features to projects they like and use?

Call me ignorant and please point out why, but I don't see any of the above as resource issues. The only problem is the "Linux" users, and I feel that if they start to see themselves as users of distros, instead of users of a name given to a kernel and a foundation, that problem will sort itself out.

By your account it seems like it already started and Ubuntu might be able to bring in a second or third option.

By the looks of it, the greater community wanted Ubuntu to spend its time bending existing tools to do what it needs, to benefit the community, without realising that Ubuntu is the product of a company that has goals and cannot always take the long road of community input, though at least it tries to be as open as it can.

Which gets criticized a lot because it is a leader in a "linux" environment it mostly created from scratch.

Competition is good when there is choice, i.e. when there is actual competition. When this plays out as an edge case (i.e. monopoly like situation) i.e. stuff like "SurfaceFlinger only" or "Mir only" - it's not good.

Canonical draws its success on the community (Debian and others). With that, being selfish and isolationist, while at the same time claiming to be "the Linux" is not considered respectful. I personally avoid Canonical/Ubuntu because of that.

From the discussion on Google+ by the Wayland developers [1], it looks like the reasons given by Canonical are quite poor.

[1] https://plus.google.com/100409717163242445476/posts/jDq6BAgd...

>But it worries me that Ubuntu is increasingly striking out on its own.

It's a little distressing, I agree, but on the other hand, are you ever really "out on your own" when you have a big user base and your code is GPL?

Would you care to elaborate on why you dislike launchpad? I haven't used it as a developer, but as a user I've found it really useful, particularly the way it integrates with apt-get via PPAs.

I don't understand all the negative reactions. Canonical is recognizing various problems in making GNU/Linux mainstream. They are then innovating at a deeper level (fixing root causes rather than duct-taping) to ultimately attempt to really attract the layman to a mobile or desktop GNU/Linux distro. Devs don't need to target Mir if they don't want to, Linux users can switch to another Debian derivative if they don't like it, and the layman discovers that Linux can possibly be just as shiny as Mac OS. Can someone explain to me why this is all so horrible?

The Wayland people really sound like they know what they are talking about, based on decades of experience. See, for instance: https://www.youtube.com/watch?v=RIctzAQOe44

Are these people designing a better answer to Wayland because it's really better, or because it always looks easy when you don't intimately understand the problem space? I honestly don't know, but it's a question I'd like answered before I get excited.

It looks like they have design goals such as using Android drivers that Wayland doesn't address.

I'd like to hope since Wayland has been in development hell for ~5 years now, that if the Mir people are doing their job, they are intimate with both X and Wayland, see where Wayland improved, and where it made mistakes, and fix those.

I think there are probably three obvious reasons.

1) For the past two years or so the community has been led to believe Canonical would be adopting Wayland. After the slow build of anticipation over that, it was dropped out of the blue.

2) Simple distrust of Canonical's homemade projects after the Unity fiasco.

3) Fragmentation of effort to unseat X. The chance of repeating the history of every other display server that was going to "replace X within Y years" becomes that much higher with two competing alternatives marginalizing each other.

If this news had come three years ago I imagine the response would be wildly different.

1) I was also under that impression, but come to think of it, who created that expectation, the community or Canonical? I don't know.

2) Calling unity a fiasco, is like calling it a failure. Which I believe is simply not true.

Yes some people didn't like it and moved on or made their opinions heard on forums etc.

But Canonical has a plan with Unity, and at the end of the day it gives a better user experience to me and hopefully a lot of other Ubuntu users.

3) There are reasons to unseat X and there are reasons not to unseat X. But what to unseat it with lies with the people who have certain problems, and maybe the problems Ubuntu faces with Mir are different from the problems Wayland and its community face.

I personally would like to see two different tools rather than one large tool like X, and if you read the article, they explain why they decided on Mir instead.

> 1) I was also under that impression, but come to think of it, who created that expectation, the community or Canonical? I don't know.

Mark Shuttleworth created that expectation, when he made a blog post saying Unity would be switching to Wayland: http://www.markshuttleworth.com/archives/551

Thanks for the link!

I did a quick Google for "ubuntu having problems with wayland" and some interesting links came back about developers struggling to get Wayland working, so it didn't get dropped suddenly.

It seems that the Mir spec is a public statement that after two years they are giving up on Wayland, with reasons and an alternative.

Some individuals are up in arms in the discussion, but how should Ubuntu have handled it differently while keeping to a deadline?

> Calling unity a fiasco, is like calling it a failure.

This has gotten singled out so I'd better say that I meant to find a more polite way of saying "shitstorm". It's not a judgement on Unity itself, but I think it's fair to call the reaction at the time a fiasco.

re: Unity - agree. Although I was initially skeptical, now I like it, and it makes more sense in the context of ubuntu on tablets.

I still use Unity, for some reason, and I am shell-shocked it was allowed to be released.

I've never worried about window focus failing to work properly until Unity. I've never had terminal windows fail to be refreshed when scrolling text until Unity.

"Unity fiasco": any data on the net decrease of ubuntu users caused by unity?

I personally dropped Ubuntu for Linux Mint.

I've heard a lot of people say they hate Unity.

It's not hard data, but it's anecdotal evidence that there's a substantial set of people that really hate it.

Personally I love it. I had planned on installing a tiling window manager, but after 10 minutes with Unity I decided to stick with it. It's the first Linux GUI I've found ok since Enlightenment 0.16, and the first time since I left the Amiga behind I've had an environment I'd say I'm happy with.

And most people I've heard who have used Unity love it. The only people I hear complaining are typically people who expect the Linux desktop to remain static and unchanging the way it was when they first used it.

Unity might be a fiasco with a small subset of old Linux users, but Unity or something like it will be essential for Canonical to keep attracting new users, who are expecting polish at the level of OS X rather than at the level of the Linux desktop of a decade ago.

>The only people I hear complaining are typically people who expect the Linux desktop to remain static and unchanging the way it was when they first used it.

Or who are stuck on older hardware (why was 2d retired, again?)...

Or who think it's utterly silly you have to write a INI (.desktop) file to put arbitrary links in the "dock"..

Or who think that the interface wastes a lot of space...

You've gone and generalized a lot of people with legitimate complaints about that DE. Kind of like Ubuntu's developers do.
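For context, the "INI file" in that complaint is a freedesktop.org Desktop Entry; a minimal one looks like this (the Name and Exec values here are made up for illustration):

```ini
[Desktop Entry]
Type=Application
Name=My Notes
Exec=xdg-open /home/user/notes
Icon=folder
```

Saved with a `.desktop` extension, a file like this is what you have to write by hand to pin an arbitrary link in the launcher.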

I despise Unity. I'm only using it right this minute because our company standard desktop at the $DAYJOB is Ubuntu.

Personally I find the state of desktop Linux to be fairly close to abysmal at the moment. I'm not sure we're any better off now than we were with Gnome circa 2000 or so.

I had, and still have, high hopes for KDE once Qt went LGPL, and I do run KDE on my personal laptop. It's not bad, but there are some major pieces of software that don't have Qt native versions. sigh

I like cinnamon, but you can run cinnamon on Ubuntu and don't need Mint.

What are the pros of sticking with Linux Mint? Are they performance based?

Regardless of the technical considerations, Linux and its surrounding ecosystem has a long history of many different companies co-operating on core software. Canonical's philosophy seems to fly directly against that - so far we have:

Launchpad / Upstart / Unity / Mir

Launchpad is free software by name only, and Canonical actively discourages you from setting up your own instance.

Upstart hasn't been widely adopted outside of Ubuntu, and has been replaced by the technically superior systemd.

Unity has been extremely unpopular from a user-experience point of view, and now we have Mir. So past history isn't filling me with confidence. Their philosophy seems to be "patch first, ask questions later".

I'm amazed how Canonical has the resources to keep branching out so much while producing a distribution every 6 months. I was under the impression they weren't yet making a profit.

Upstart is a bad example. pacman is a package manager only used by Arch Linux and nowhere else. Should Arch abandon it because it hasn't seen wide use yet? Better yet, there are plenty of package managers that are better than both aptitude and pacman (http://nixos.org/). Should we abandon the technically inferior solutions for the technically superior ones?

Unity is the same way: it is open source. If you don't like it, fork it and fix it. Or use one of the alternatives. This is not Windows or OS X. Vote with your feet and move over to Gnome 3, Xfce, LXDE, or another DE. If enough people do, Canonical will see the effects and stop putting effort into Unity.

IMHO, Linux and its surrounding ecosystem has a long history of everyone trying to pull it into their own direction. Companies are often forced to cooperate when they don't have another choice, but if they did, they would push their own ideas of what the infrastructure should be on everyone else.

I'm an Arch user. I love pacman. So easy to use.

But I would personally much rather see pacman (and every other distro-locked package distribution format) dropped, with everyone adopting debs or rpms.

What I would like most, though, is for the most technically advanced and easiest-to-use package manager to win, and everyone to just adopt that. I know that pacman makes building packages insanely easy, compared to debs or rpms.
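To illustrate the "insanely easy" claim: a complete, working PKGBUILD (Arch's package recipe format) can be this short. Every field below is made up for the example; only the variable names and the `package()` function come from the real format.

```shell
# Hypothetical minimal PKGBUILD; makepkg sources this as a shell script.
pkgname=hello-example
pkgver=1.0
pkgrel=1
pkgdesc="Toy package showing how small a PKGBUILD can be"
arch=('any')
license=('MIT')

package() {
  # Install an empty placeholder file as the "program".
  install -Dm755 /dev/null "$pkgdir/usr/bin/hello-example"
}
```

The equivalent deb or rpm packaging typically spreads the same information across several control/spec files and maintainer scripts.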

> Upstart is a bad example. pacman is a package manager only used by Arch Linux and nowhere else. Should Arch abandon it because it hasn't seen wide use yet? Better yet, there are plenty of package managers that are better than both aptitude and pacman (http://nixos.org/). Should we abandon the technically inferior solutions for the technically superior ones?

A couple of points:

- one example does not plenty make

- Nix has a really interesting technical design. However, it trades the ability to patch security holes across your entire system for the ability to install multiple packages side by side and have atomic upgrades.

More importantly (I was lying about the two points), the package manager is tied intimately to the guts of the distro, so it's much less shocking than having Ubuntu implement Unity.

Now, we'll see what becomes of Mir. Whatever mindshare Upstart had went straight to systemd as soon as it came out. If Mir is primarily an expression of hubris, I expect it will follow Upstart in obscurity. If Canonical manage to convince enough developers in the community, they may get something going.

It's not the same. Unity heavily relies on Ubuntu-specific patches. There was quite some effort involved in getting Unity running under Arch Linux.

"better" is generally subjective. there's a reason why nixos, upstart, etc, don't catch on.

For pacman vs Nix: well, pacman is extremely simple. Extremely reliable. Nix has some design advantages, but those don't translate all that well to the practical world right now. Maybe in the future.

Upstart was used because it was better than traditional init, and was used by Fedora as well. Then again, Linux could have had a much better init system many years ago, but the GPL zealots wouldn't hear of having launchd, and its less restrictive license, be used.

I don't think that's the reason. Apple even relicensed launchd to make it easier to use. It still didn't get adopted.

To clarify, my comment was that even before relicensing, launchd was under a less restrictive license than the GPL -- the APSL allows for binary linking -- but had clauses that didn't allow people to GPL their code. By the time launchd was relicensed, ubuntu had already released upstart, and another init replacement so quickly would have alienated people.

I don't see the problem in wanting to keep the guts of the OS forced-free by license. If any of the deepest components of the Linux stack were MIT or BSD licensed, it wouldn't take long for some business to fork it into a proprietary blob they push for market adoption and you lose the freedom deep in the OS.

With something like init, I would have to disagree. The biggest problem with this decision is that they ended up with an inferior project that wasn't nearly as flexible as the open options. While there are certain areas where keeping things under a restrictive license like the GPL may be useful, the init system is one where I can't see any way a closed system would offer any sort of competitive advantage.

> Unity has been extremely unpopular from a user-experience point of view

Has it? Do you have numbers?

Because I see a small number of people who complain very loudly, and often make sweeping claims about how unpopular it is without any evidence, who always get followed by responses from people (like me) who love it...

upstart is used by the current release of RHEL, but it's true that Red Hat has signaled that systemd will come to RHEL as well (Fedora now comes with systemd as far as I know).

I don't see the problem with them using whatever they want if it provides a good solution; when Upstart was developed, systemd wasn't even a viable alternative, and for me it still isn't if stability is what you want and you run a non-desktop system.

Exploring Android internals, I much prefer SurfaceFlinger to X in my understanding of its architecture.

"Launchpad is free-software by name only". No, it is actually free as in freedom software. I think you meant to say that it's free by license only.

Don't forget the copyright assignments required by Canonical. "Open source, but we get to own your source" is a thin model of open source.

> the Layman discovers that Linux can possibly be just as shiny as Mac OS.

Why do we want the layman to be using Linux? To give Valve more Linux customers or something?

We should let them be. Laymen will invariably be better served by Microsoft or Apple; trying to win them over in some sort of misguided drive to "win" market share seems foolish. Linux should focus on its niche, and Microsoft and Apple on theirs.

This market-share envy makes no sense to me. Does Arctic Cat stare with envy at Ford's userbase and expend effort trying to get suburban parents to drive their kids to school on snowmobiles? That would just be silly.

Practically: I want to be using a system that I can be confident has good hardware & software support, not always double checking everything I buy to make sure it will work. That means getting at least a few % market share so that other companies take it seriously.

There's also an ideological angle: many Linux users feel that there's something wrong with a world where only the technically adept can use a free, open source operating system.

> not always double checking everything I buy to make sure it will work.

I feel like we've been there since... 2007? My last several purchases were made with zero caution or research anyway, maybe I've just been getting lucky.

> many Linux users feel that there's something wrong with a world where only the technically adept can use a free, open source operating system.

That's fine and all, I don't buy into that ideology but it is fine if others do. Perhaps someone should warn these unsuspecting masses that "Linux for your technologically illiterate grandmother" advocates are not as pragmatic as they may claim to be. I feel like these people are dressing Linux up as something that it is not in order to sell it to people who, were it presented honestly, would not pay it a second glance.

The hardware situation has got much better, but there are still some problems. Laptops with both integrated and separate GPUs aren't very well supported (so I hear, I don't have one). There's a relatively new scanner in my office that's not supported by SANE. I wanted an external wifi adapter for a desktop recently, and I'm not sure how many will just work with Linux.

On the software side, Steam got ported recently, but there's plenty of other software that we'd like to see. Flash is no longer released for Linux, although for now at least I can still watch iPlayer with the last version of Flash that was released. Hopefully HTML5 will have killed it before much longer.

Of course we shouldn't pretend that Linux is something that it isn't. If it's not good enough, we aim to improve it. But I honestly think that, if non-technical users got something like Ubuntu 12.04 pre-installed on compatible hardware, it would work perfectly well... except for the expectation, both from users and app developers, that everyone has Windows. Which brings me back to the point about market share.

Even drivers that were previously mainline have broken. So while the Intel 4965 used to work fine on Linux, it no longer does. And that is for a driver in mainline.

This is a fundamental misunderstanding of Ubuntu's raison d'etre, which is to bring Free (libre) software to the world.

Except the rest of the (consumer) world doesn't value libre as a first-class feature. They want the shiny.

The way we get to bring our values to the world is to be commercially relevant. Because otherwise, hardware vendors don't care about you, ISVs ignore you, app developers target other platforms, and so forth.

Market share isn't the end goal. It's a means to the end.

"This is a fundamental misunderstanding of Ubuntu's raison d'etre, which is to bring Free (libre) software to the world."

Here's my marketing campaign. Sad as it is, it's the best possible spin:

"Hey world, I've got something new for you. It's incompatible with everything and doesn't work with anything and can't actually do anything you want, but hey, it's new so it must be better, and it does a bunch of stuff that nobody outside the dev community cares about. Oh, and we "had" to take some stuff away, like network transparency, which is awesome, err, I mean ignore the man behind the curtain and someday VNC might work. Oh, and every dev who's finally gotten used to the existing weird design will have to retrain. Other than that, trust me, it's awesome, you'd better switch now. And by switch, I really mean it, since it'll be incompatible with everything else ever written." How could anyone not drop their entire legacy installed base of hardware and software and training and switch? I mean, it's got transparent animated dancing robots which no one can do without, and it's got electrolytes that plants crave!

I'm sorry I just can't do any better given bad news.

Seems dishonest and manipulative to me. Con a bunch of people into adopting something that doesn't really fit their needs, is just shiny, so that you can use them as leverage... that is not something I am interested in.

Meanwhile all of those unskilled users are not without a cost. Free software support infrastructure is not set up to handle that many unskilled users. If Canonical is going to be giving all of those people 3rd party support themselves, that is all well and good, but it seems to me they tend to punt the ball.

The entire reason desktop Linux has been a failure is exactly that it does not fit the needs of most people. And for that audience, "shiny" is just as important as "featureful".

In the modern marketplace, you can't choose either/or. It has to be both.

Ubuntu wants to do both and additionally add libre.

The strawman fallacy you inferred from my statement has the additional fallacy of incorrect causality.

People are initially attracted to Ubuntu for a variety of reasons, and we hope to keep them because we are fit for purpose, not for whatever ideological reason or niche feature, and certainly not due to any sort of con job.

Desktop Ubuntu has failed to reach that critical mass of users to be taken seriously by industry (our good friends at Valve notwithstanding), and the only logical place to jumpstart the install base is the mobile world.

The mobile market is brutally competitive. Ubuntu will succeed or fail on its own merits, not because of any dishonest manipulation of innocent users.

Has desktop Linux been a failure? By what measure, adoption by people who are not in its niche market? Would it stop being a "failure" if we got a bunch of laymen using it?

Compare community desktop linux distros with community distros of something more oriented towards the masses... say.. Android.

The Debian Project is a well oiled machine, coherent and consistent. Everything has a well defined process. Technically capable users don't have to sort through piles of crap to get work done.

Cyanogenmod on the other hand is an utter shitfest. Standard operating procedure there is wading through clusterfucked forums looking for undocumented, unofficial fixes (in prebuilt binaries mind you, no source) in threads thousands of posts long filled with idiots saying "hurr durr, I dropped my phone in the toilet, now this patch doesn't work". They cannot handle the size of their technologically illiterate community, and everyone attempting to use their software, technical or otherwise, suffers as a result. (This isn't even touching the issue of shittier hardware support than you would ever expect to see with desktop Linux...)

If piles of unwashed masses were really what desktop Linux is "missing", then community maintained Android distributions should, far from being a nightmare, be the promised land. This is plainly not the case.

You keep putting up strawmen.

Show me the desktop Linux equivalent of Android, not Cyanogenmod. It doesn't exist - and you can measure that failure with any number of metrics.

If Ubuntu can achieve that type of success and protect the freedom of users with the GPL (instead of the permissive BSD model for hardware makers) then we will have accomplished something in the world.

You have missed the point. Show me the Android equivalent of Debian.

Show me an Android project, with all of the unskilled users that come with it, that is anywhere near as organized as the Debian Project. Show me that legions of unskilled users have allowed this Android project to achieve hardware compatibility at all comparable to what Debian achieves on the desktop.

Such a project does not exist. I assert that it does not exist in no small part because they have too many unskilled users, and because hardware support does not materialize as soon as you reach some sort of "critical mass" of unskilled users.

Idiot users are toxic; anything that touches them rots. Only corporations that are prepared to completely disregard community involvement are capable of wielding an idiot userbase. Canonical is neither up to that task nor does it even appear to be pretending to be. Why? Probably because the lunatics run the asylum.

> Why do we want the layman to be using Linux? To give Valve more Linux customers or something?

> Idiot users are toxic; anything that touches them rots.

Wow. The level of arrogance and elitism on display here is breathtaking.

This is why Linux on the desktop is irrelevant. Not the fact that it's incompatible with most mainstream software, not the fact that it's low profile or the fact that most Linux desktops look and function like Windows' retarded younger brother – but because a tiny minority of the otherwise hugely welcoming community set themselves up as some sort of entitled priesthood and actively discourage 'idiot users' from getting involved.

New users, even idiot ones, are good. Yes, they bring problems and stupid questions. Everyone has to start somewhere. But more people involved means more investment and helps challenge entrenched assumptions about how things should work. Making things shinier and more accessible does not equal dumbing things down.

Why shouldn't everyone be able to use an approachable, thoughtfully designed system that 'just works' and yet shares our values of freedom, community and open source?

> but because a tiny minority of the otherwise hugely welcoming community set themselves up as some sort of entitled priesthood and actively discourage 'idiot users' from getting involved.

I've spoken to many people over the years and looked at why they use Windows rather than Linux. The reasons, in order from most common are:

1. Lack of the applications they need; nearly always Outlook and Excel

2. Familiarity with Windows that they feel they've invested a lot of time in

3. Simply didn't realise there even was an alternative

Out of all the people I've talked to, there have been a grand total of ZERO that have ever given "Had a bad experience with a member of the Linux community" as a reason. None. Ever.

So, I'm more than a little dubious about your claim that this is in fact the main reason. Got any figures to back up that little theory of yours?

> Making things shinier and more accessible does not equal dumbing things down.

Except in practice, it almost always seems to. Which is why I don't and can't do any significant development, or work of nearly any kind for that matter, on Android, or an iPad.

> Which is why I don't and can't do any significant development, or work of nearly any kind for that matter, on Android, or an iPad.

That's about picking the right tool for the job, not dumbing down.

It's like complaining that a trowel is a dumbed down shovelling device because it can't make any significant headway with digging a massive trench.

That said, I develop Drupal sites locally on my jailbroken iPad and it's pretty nifty to have access to a full development environment on something that portable.

Arrogance or not, the fact remains that community developed Android distributions have all the unskilled users Ubuntu could ever dream of having, yet the projects are even worse.

Projects that rely on community support simply cannot handle the load. It is a failing of community-driven open source, to be sure; it would be great if it were otherwise, but I see absolutely no reason to think it is not reality. Your theory is nice, and it would be nice if it were reality, but it does not fit the data.

>Idiot users are toxic; anything that touches them rots.

implying Firefox and Chrome will rot.

Firefox and Chrome have backing organizations that are willing to not rely on community support.

(And frankly, I would argue that Firefox has been rotting for years now, though likely for unrelated reasons.)

Show me the desktop Linux equivalent of Android, not Cyanogenmod. It doesn't exist - and you can measure that failure with any number of metrics.

Sounds like what you're really saying is "you can come up with any metrics you want, then define that as failure to justify my position". There's really no objective basis for saying "desktop Linux is a failure", especially when there's no objective standard by which we could say "desktop Linux is a success" either.

Like somebody else just said, "market share? Who cares?" Market share is a means to an end, not its own end. Desktop Linux works reasonably well for a certain population of people with a certain set of values. Does anybody expect that desktop Linux should gain Microsoft Windows-like ubiquity?


It's an incredibly useful tool, with a lot of very smart engineers contributing awesome stuff to it. It's a development platform second to none and far more capable than any other OS I know of.

Oh did you mean market share? Who cares?

I don't think it's conning them into something they don't need. I think it's giving them what they want (shiny, easy to use) so that they have something that they need but don't realize it yet (libre).

I agree. But then the great thing about Linux is we don't have to use the layman distribution.

If someone wants to put the time and effort into making something that is great for my grandparents but awful for me then more power to them.

Meanwhile the rest of us can stick to the distros that are suited to our needs. I think this state of affairs is much better than any one operating system ruling. So long as we have binary compatibility among the Linuxen then I'm happy.

Who are we? I only see a company and a foundation geared to serve a platform, namely Ubuntu, and not Linux.

Yes, Ubuntu is based on a FOSS ecosystem, but that does not mean it shares the same goals as Linux, whatever the goals of Linux are.

It means it's backed by a company which, in my eyes, wants to bring the wonders of a second Unix-based operating system to the layman. Only in this one everything is open, and that for me makes sense.

It could help make the software market more competitive. We can already see that Apple and MS want a lot of power over their platforms to the point of being able to decide what software will run. A more neutral platform in wide use could make the playing field more level thus leading to better software in the long run.

I agree. This is why I dislike Ubuntu. It constantly tries to appeal to Mac and Windows users.

Simply don't use it...

Embrace extend extinguish.

Wait until the only way to run FF / Chrome / 3-d accelerated video driver / vi / emacs / something else thats "critical" is to switch to Mir. After all its free, all you have to do is abandon everything else. What could go wrong?

"Simply don't use it" is like telling people to take up a new hobby once theirs has been destroyed. But I don't want to take up a new hobby; I like this one. Sorry, no, someone else has decided to destroy it, so don't complain or try to work around the damage, just select a new hobby. Hey, you look like a technical type of person, sorry about the death of free software and all that, but I bet you'd like ham radio or maybe advanced model railroading? ...

If a free software program decided to stop supporting X and Wayland, it would be because no one cared enough to contribute the code for continued support, which would imply that very few people were still choosing to use X or Wayland over Mir. If Mir really ends up being that wonderful, why shouldn't it win?

How does one free software program replacing another constitute the death of free software?

vi or emacs? Not going to happen; these are among the most portable pieces of software in existence.

As for Chrome or FF, that only depends on the developers behind each product. I doubt that Google will ever target Mir directly; actually I believe the GTK+ backend could target Mir and then, boom, all GTK+ applications will run on Mir, including Chrome. If Google ever comes to target a system with their browser, it will probably be Chrome OS.

Firefox is another beast, but unless there's a decision to abandon Linux altogether there'll be devs working on it, or on Iceweasel, its Debian cousin, and this will probably run/compile on every Linux (and BSDs?) out there.

About 3D, this is a shit area. I believe this move signals that Canonical will try to sell ARM computers in the future; there's a known lack of free drivers for GPUs in the ARM world. To see what I mean, just search for the Raspberry Pi "open source" GPU drivers debacle to see how things are going. This is not a criticism of Broadcom, who has its trade secrets to protect, but the situation there is far from perfect and always will be.

Hey VLM, could you take a look at this [1]? I wanted to get your advice on something, but it got buried. Thanks!

[1] http://news.ycombinator.com/item?id=5289406

For better or for worse, market share was always Canonical's goal. See their bug #1.

If Linux idiots weren't senselessly married to terrible in-group traditions and the status quo, it wouldn't have taken 20 years (or Canonical) to replace the clusterfuck that is X.

I use Linux and I don't even use a graphical environment most of the time because all it does is slow me down. (When I do use one, it's Ratpoison.) The most effective Linux developers I know work similarly. I believe this is a large cause of us being "senselessly married to terrible in-group traditions and the status quo". I don't know if you're calling all Linux users idiots or what, but I consider Linux idiots to be the people who use Ubuntu and Unity and don't have any idea what's going on underneath. They obviously aren't going to be doing any replacing of X.

Hang on, what have we got here?

TL;DR: Linux is not one thing. Various projects are not mutually exclusive.

We have GNU/Linux with a huge array of applications all licensed in such a way as to allow forking and customisation.

We have different groups of users of desktop/laptops; sysadmins; developers; scientific users; people who use administered desktops at work (e.g. French police &c)

We have a smallish and very pushy company in the UK (Canonical) who want to build on GNU/Linux to produce an operating system (Ubuntu) that will work on TVs, phones, tablets, fridges &c. Their target audience won't be developing alternatives to X, and probably won't be writing bash scripts or anything. (That isn't to say that some of their users won't be doing those things.)

We have a much larger company in the US (Red Hat) who provide a stable, conservative desktop as a complement to their commercially supported server OS (RHEL). They part-fund a desktop system that has changed radically recently (GNOME).

There is a German-based company that makes an enterprise desktop OS and server OS and contributes to the development of a different desktop environment (KDE).

We have a not-for-profit foundation type entity who push out a version of GNU/Linux that can run on a huge range of architectures (Debian)

There is this thing called 'upstream'. Small projects push out all kinds of applications including alternative desktop environments.

So Canonical does its thing, some of the stuff they do may get back into the mainstream GNU/Linux, some may not. Some hardware manufacturer may or may not take up the system. I may be able to dock my phone to a keyboard/mouse/monitor and write documents in LibreOffice and edit podcasts/photos in a couple of years or 5. (bring that on).

The others carry on as they do now. You may have to choose the graphics card more carefully in future but I hope not. I suspect workstation class computers with separate monitor/bases will get more expensive and specialised anyway, and I imagine someone will do a 'high end workstation-OS' just for them.

I am not calling all linux users idiots.

I made a picture for you to clarify:

http://i.imgur.com/ulGCde2.png (not to scale.)



The problem isn't Linux idiots (whatever that means) or their cautionary tendencies, it's Canonical.

Care to elaborate? Your comment reads as a sentence-long "NO, YOU!" right now. What about Canonical is a problem, and who are they causing the problem for?

As a "Linux idiot" myself, I hear your disapproval of the parent post, but am curious to hear the details that support your opinion (I might be able to learn something from it).

This is how I parsed the conversation:

    pilgrim689: Why is everybody hating on Canonical for
                ditching X11?
    sneak: Because they're "Linux idiots" who love X11.	
    ihsw: They don't love X11, they just hate Canonical.
ihsw is saying that the change isn't getting negative reactions because of the change itself, but because it is being made by Canonical.

edit: I accidentally using the wrong verb form.

I thought everyone was hating on Canonical for ditching X because rather than support the fledgling standard replacement for X, Wayland, they once again (just like with systemd vs upstart, compiz vs mutter/kdm) go off and do their own thing.

And if it turns out like either of those, they end up with an inferior barely maintained product that gets pushed to the sidelines once they show it off on a showroom floor.

Skimming the article, Canonical has some legitimate criticisms of Wayland, and are working on something that doesn't have those problems. Yay, this would be good! Except that it's by Canonical, and as you said, "Canonical never delivers". (That wasn't a quote from you, it was an iteration of "OP (never) delivers")

What I said was that when Canonical does deliver, they let the delivery wither and die without support, because they are spread thin, like a teaspoon of mayo on their footlong distro sandwich. See: Upstart, Compiz, Software Center.

We agree, I just called that not delivering.


The linux idiots have one glaring example: PulseAudio

Throw away user experience, system stability, judicious use of CPU for what?

It certainly improves certain details of the user experience, when it works correctly, while also usually consuming more CPU than Flash Player.

> It certainly improves certain details of the user experience, when it works correctly, while also usually consuming more CPU than Flash Player

You make a sweeping statement which is not true for every system. Looking at what happens playing a Youtube music video, according to htop:

- pulseaudio is using between 0 and 1% CPU

- plugin-container (wrapping Flash) is at 3%

- firefox oscillates between 20 and 40%

Now, my understanding is that, at the time pulseaudio came out, it was using untested functionalities in audio drivers which, for many chipsets, did not actually work and progressively got fixed. There was the same problem with graphics drivers when KDE 4.0 was introduced. Maybe you're out of luck and actually have a defective driver?

Thanks for this insightful answer

I'm not sure I have a bad driver; the last two chipsets I used were Intel (with no weird audio devices), and audio works fine with other software. Latest distros tested: RHEL 6 and Fedora 16.

For me, PA would stay around 5%, 10% sometimes, and even more depending on the task. Not to mention crashing every week (at a random moment)

That's curious. You might be a victim of this bug:


I'm not sure if this could affect the CPU usage. I hear that you can play with the resample setting to lower it, but I haven't had to do that on a 5+ yo desktop.

Hmm, do you mean HDA by Intel? If so, look closer, because HDA is an umbrella for a lot of codecs.

Edit: to be clearer, HDA does not automatically mean Intel chip. I made that mistake before.

You should avoid RH based distros for Desktop related activities. Perhaps you should try Ubuntu, Mint or Suse.

Nonsense. Fedora is a fine desktop Linux... I run it on a woefully underpowered laptop which I use for development, and it is every bit as usable as, say, Ubuntu (which I use on my corporate laptop). In fact, I'd take my Fedora/KDE setup over this Ubuntu/Unity setup any day.

Am I the only person in the world who has never had anything but a pleasant experience with PulseAudio? The first time I noticed that I was using it, I had already been using it for a year...

No, you're not the only one. I'd been forewarned of the awfulness of pulse, but didn't even notice its installation until later, when I discovered all the useful things I could do with it: play music over the office Apple airplay thingy; use two sets of headphones simultaneously; send audio to another computer in the house. Without any typing, searching, or rebooting.

I remember OSS and Alsa, and I don't miss either. As far as I can see, it makes Linux audio work like any other computer in this millennium.
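For anyone curious, multi-device tricks like the ones above can usually be reproduced from the command line with pactl. The module name is real, but the sink names below are placeholders you'd replace with your own (a rough sketch, not a definitive recipe):

```shell
# List available output devices (sinks); the names below are placeholders.
pactl list short sinks

# Combine two sinks (e.g. speakers + USB headphones) into one virtual sink,
# so both play the same stream simultaneously:
pactl load-module module-combine-sink sink_name=both \
    slaves=alsa_output.pci-0000_00_1b.0.analog-stereo,alsa_output.usb-headset.analog-stereo

# Route new audio streams to the combined sink:
pactl set-default-sink both
```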

I haven't had any recent trouble with it. Back in the early days, it didn't always work well with SDL, but that was fixed years ago. I've been able to crash the daemon with stress tests, but it seems to restart automatically.

I can't understand all the dislike of PulseAudio either. It certainly has a nicer API than /dev/dsp.

My most recent attempt to use PulseAudio (as it came via Ubuntu) had the quirk of not activating the headphone jack when something was inserted. It dutifully turned off my speakers, but no sound was available elsewhere.

I previously had stock Debian on the machine, and everything was working fine. A half day's effort was put into toying with ALSA configs that Pulse was deferring to for whatever driver choices it made. Dozens of help articles or threads where everyone else in the world could proclaim "Thanks, that fixed it for me!" left me with less functionality than I started with.

The only thing Pulse has ever done right for me was help to record some loopback audio on an otherwise-crippled sound card.

No, there are a decent chunk of users for whom it works fine. Maybe as many as 95%. But there's also a substantial number of users for whom it introduces major problems - which would be more forgiveable (at least to yours truly) if it provided any visible advantages over existing systems.

(After a few months of trying to make pulse work I left for FreeBSD, which works beautifully and avoids a lot of linux's change (seemingly) for the sake of change)

Let me ask you some questions then:

- How often do you reboot your computer?

- What audio software do you use the most?

For me, PA crashes around once a week (and this is with the latest version I tried), or just plain stops working (like muting the sound, garbled sound, etc)

Not to mention the delay that happens while switching songs in mplayer for example. Use ALSA and song switch is instantaneous.

> - How often do you reboot your computer?

Once every two or three months, when I remember to apply updates.

> - What audio software do you use the most?

Only flash (youtube, primarily. Also youtube's html5 player probably, I haven't paid much attention to that) and mplayer. A few years ago, XMMS2. I used to use an XMMS2 client I wrote for music, now I just use some bash scripts and mplayer; I can't say I've noticed any delay while switching songs with mplayer or XMMS2.

When I play music from my laptop through my Raspberry Pi connected to my TV with PulseAudio, I notice an audio delay. That is the only sore spot PulseAudio has ever presented me.

Very interesting. I thought some instability may be due to PA staying up for a long time, but apparently not.

About the delay, try this: mplayer *.mp3 (in a directory with mp3 files, of course), then press >. There should be a significant delay when switching songs (around 0.5s). Make sure mplayer shows it's using the [Pulse] driver.

With ALSA this is very fast
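If anyone wants to reproduce this comparison without changing any system configuration, mplayer lets you pick the audio output driver per invocation (these are standard mplayer flags; the size of the delay will vary by setup):

```shell
# Play a directory of mp3s through PulseAudio, then through ALSA,
# and compare the pause when pressing > to skip to the next track:
mplayer -ao pulse *.mp3
mplayer -ao alsa  *.mp3
```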

"Not to mention the delay that happens while switching songs in mplayer for example. Use ALSA and song switch is instantaneous. "

There's a good possibility that this is mplayer's fault more than PA's.

I agree with this. The only problems I've ever had with it have turned out to be either problems with the actual hardware or problems with the clients, not problems with Pulse itself.

I will admit having a lot of a hate for figuring out which audio 'profile' is which...

If you've got the plugs for 7.1 and a normal line-in, as well as possibly a digital one, then you've probably got 23 choices of profile for your built-in audio device. Add in a GPU with HDMI and the NVIDIA driver and you probably get at least another 7+ choices, all labeled Nvidia... I find I frequently have to guess multiple times whenever I hook up a TV via HDMI.

But it works quite well if you define works as works without adding on any usability riders.

No issue on a couple of machines. Running it via a call to start-pulseaudio-x11 from my .xsession works fine. I believe running it as a system daemon is not recommended, though.

PulseAudio is good now. It was a different story for me back in 2007/2008 though.

Not this again... You could have complained about this years ago, but it makes no sense today. PulseAudio got fixed.

> user experience

you can set up PulseAudio for LAN audio streaming with padevchooser (GUI). Could you do this with ESD? Arts? Tell me about this audio streaming solution with super UX for Linux that you know...
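For reference, the same LAN streaming can be set up without the padevchooser GUI, using pactl and the PULSE_SERVER environment variable. The IP addresses below are placeholders, and exact module options may differ between PulseAudio versions:

```shell
# On the receiving machine: accept PulseAudio clients from the local network.
pactl load-module module-native-protocol-tcp auth-ip-acl="127.0.0.1;192.168.0.0/24"

# On the sending machine: point any PulseAudio client at the receiver.
PULSE_SERVER=192.168.0.10 mplayer song.mp3
```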

Parent seems to have been experiencing this issue recently. Saying "PulseAudio got fixed" does not mean "it works perfectly for everybody" (see https://bugzilla.redhat.com/show_bug.cgi?id=813407 ). Most people care more about getting a sound system which works without crashing or sucking up CPU than about getting a sound system with streaming. The fact that you and I have had a good experience with it does not mean other people are not having issues with it (just as their bad experience may not reflect the overall quality of the software).

That's a Fedora bug. It wouldn't be bleeding-edge (enough!) if it didn't crash. Running Fedora and complaining in a public forum about crashes (without saying the distribution name in the first place!) makes no sense to me.

"without saying the distribution name in the first place"

RHEL 6, Fedora 16, Mint 11 (while it was not obsolete), Ubuntu Maverick Meerkat (amongst others)

Oh ok, so Fedora users just have to suck it up and accept a crashing system? Congratulations, you don't know anything about distros.

"you can set up PulseAudio for LAN audio streaming with padevchooser (GUI)."

Yes, and? I understand this is a useful feature for some people, but I'm just interested in listening to music, and PA does not allow me to do that without frequent crashes and bugs.

I don't care about the features of a car if it stops working at random. And yes, this has happened on every distro I tried since PA started being bundled. On every single distro and every single machine, I've only had a stable system with PA off.

Bugs can be fixed. Pulse is newer than Alsa and more complex, you should expect bugs. If you don't have a problem with the implementation details of PA (software mixing, optional software equalizing, networked audio, stateful audio adaption to devices) then just report the bugs and make it better. Don't whine and try to fragment the Linux space any worse than it already is, especially with Mir's announcement.

The real question is why did you need something newer in the first place? Are years of bugs, misconfigurations, and random crashes worth the minimal improvement end-users will get in the end (when everything finally works for everyone)?

Yes, pulseaudio has all sorts of awesome esoteric features. You can stream sound transparently from a linux box to a windows one. That's kind of useless if it brings down your computer.

Yes, I should expect bugs soon after release

But PA is 8 years now.

I don't remember any software being so problematic as PA in the Linux world. Some have more problems, yes, but they usually have a limited deployment.

Flash Player? You didn't have problems with that?

Of course I did. Still less frequently than PA

> you can set up PulseAudio for LAN audio streaming with padevchooser (GUI). Could you do this with ESD?

Considering that was pretty much the point of it, of course.

The point of ESD? It wasn't nearly as user friendly if I remember correctly. But I was a KDE user.

Oh, didn't realize you were asking about the UI specifically. I believe you would just set an environment variable of where you wanted the sound to go.

The part I liked was that you would start ESD and everything just worked.

I conducted an experiment on my last laptop, having noticed PulseAudio taking up more CPU than my media player. I found, quite consistently, that disabling PA led to my media player using more CPU on its own than the sum of the media player and PA when both were running.

I'm going to guess you haven't used pulseaudio in the last 3-4 years?

> The linux idiots have one glaring example: PulseAudio

PulseAudio works wonderfully for me. It allows software mixing, something that's never been possible for me prior to PulseAudio, and it never crashes.

It also gives me a single place to control volume on a per-app basis (pavucontrol). That's not something I had prior to PulseAudio, either.

I can also keep a single desktop session going for months.
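The per-app volume control that pavucontrol exposes can also be scripted with pactl; the stream index below is a placeholder you'd read off the list command first:

```shell
# Each playing application shows up as a "sink input" with a numeric index:
pactl list short sink-inputs

# Set stream #42 (placeholder index from the listing above) to half volume:
pactl set-sink-input-volume 42 50%
```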

I agree with this, I have similar experiences with pulse. I also use the pulse equalizer, which is super easy to set up.

I like it. I think it does things right, and they can optimize it more over time. It is better than having a fast audio subsystem that just can't do what I want. Pulse is a kitchen sink that at least covers its bases.

Basically, Canonical appears not to be publicly consulting with the other distros (e.g., RH, Gentoo, Arch) about their changes. If they were, and there was a general consensus on a plan, I would personally not be bothered at all. But they seem content to pursue unilateral forking of what Linux means, and IMO that's not OK.

Canonical is a business. I believe they do care for Linux, because otherwise I don't see why they would be part of the Linux Foundation. But they are a business.

And like most businesses, they need to make unilateral decisions. The only difference is that Canonical makes these decisions public knowledge while it is trying to achieve a hectic roadmap, for business purposes.

Ubuntu has been a Canonical project based on Linux for many years. It is hardly the community-driven OS that people believe it to be. I think that is a good thing. They are moving forward in some direction. They will make mistakes. Hopefully they will learn from them. There is nothing wrong with that. In general, Linux userspace is very fragmented. Organizations and developers seldom agree on anything. Ubuntu can't provide a true desktop user experience in such a space. And there are always distributions like Arch Linux which offer true freedom and stick completely to Linux conventions.

The Layman doesn't want a desktop at all; they want a smartphone or a tablet. Want to drive mass adoption of Linux? Write a killer app for Android.

The folks who do want desktops, want those desktops to run software like MS Office or Adobe CS. If Linux can't do that, it does not matter how shiny it is.

You are sacrificing your objectivity; there is no reason other distros cannot be 'just as shiny' as Ubuntu or Mac OS.

Furthermore it is horrible because Canonical has a history of secrecy and authoritarianism and they absolutely go against the spirit of open source software development.

If the "spirit of open-source software development" can't stand for distro maintainers to have strong control over what goes in their own distros, this open-source spirit sounds pretty authoritarian itself. If (for example) I want to maintain my own window manager for my distro and I provide the source under a free license but don't accept outside contributions into my own distribution, what is wrong with that? Isn't that part of the freedom of free software?

Agreed. So many geeks hating Ubuntu these days, probably because they feel "betrayed" as it makes Linux more user friendly.

Most of them have contributed zero to Linux, so I don't know where this sense of entitlement and emotional response comes from.

If you don't like it, go use whatever you want.

In my experience, distros attempting to keep things as close as possible to upstream packages (except for bug fixes) are always the most usable, stable, etc. So I sort of prefer the "authoritarianism" from upstream.

Additionally, an upstream package has authority only on ONE component. The distro has authority on ALL components. That makes ALL the difference.

I actually agree. I like Debian better than Ubuntu because I feel like its packages are generally more reliable for my use. But that doesn't mean I would declare Ubuntu to be "horrible" — just not to my taste. I respect Ubuntu's ambition and hope they are successful in what they are trying to accomplish.

Yeah, I'm not saying Ubuntu is horrible by any means, but I have better luck with distros taking another direction, for the reasons mentioned

Especially since people have been slamming Canonical for years for being mere repackagers who were not contributing any significant code to the ecosystem.
