Maybe this is a good idea; I don't know enough about X/Wayland to say. But it worries me that Ubuntu is increasingly striking out on its own. What I like about the GNU/Linux ecosystem is that a lot of distros share a lot of common underpinnings, and everyone benefits from a large community fixing bugs and improving those underpinnings. It's also less knowledge to have to keep in your head for system administration stuff. (Which is still necessary in Ubuntu, regardless of what the "it just works for me" people say.)
Maybe this is the kick in the pants Linux needs to increase adoption. But I would much rather know GNU/Linux, not Ubuntu. Now Ubuntu is standing alone with Compiz, Unity, Upstart, Launchpad, and Mir, all pretty fundamental pieces of the core system. In a decade, will switching from Ubuntu to Debian be as big of a culture shock as switching from Windows to Linux?
In terms of user interface, have those standards delivered?
I remember fonts being really screwed up for a long time. Longer than it should have been. Just getting anti-aliased fonts was a major deal.
Then there are graphics drivers, which are a mess. I blame the device manufacturers here, but even treating the driver stack as a black box, what I see is inferior. I know Linus flipped off NVidia, but at the same time it's a chicken-and-egg problem: why should they bother to improve their drivers? To make GNOME 2 desktops look shinier? You criticized Compiz; OK, what is the alternative?
Google did pretty much the same with Android. We have SurfaceFlinger, Dalvik, and custom IPC mechanisms. Now they are slowly trying to merge some of them back in.
> Maybe this is the kick in the pants Linux needs to increase adoption.
That is what I think. Here is a company that tried to push, and everyone resisted and said "you must adhere to standards defined by a twenty-plus-year-old graphical server". I think that is silly. Let them innovate. I, for one, am excited for them.
If Canonical were good at it, everybody would be saying "yeah, go for it!"
But they're not.
Every time they strike out on their own, they seem to end up producing overly complex, badly performing messes with tons of issues that everybody hates. bzr, Unity, ...
The only reason people use any of their stuff is because ubuntu pushes it on their users, and people with little experience end up using it by default. Sound familiar...?
[Worse, with Mir, they seem to be going for the lock-in effect: "Mir will integrate especially well with Unity"... >< ]
That perception, that Canonical is not competing on fair terms—almost anything they come up with, no matter how awful, will come with a built-in user base—is one of the big reasons they get so much flak for this sort of thing. If it were just some guy in his garage hacking on his new display server after lunch every day (like, you know, Wayland...), and gaining users the hard way, by being better, then nobody would blink an eye.
For that reason, I think people expect more maturity from an influential company like Canonical. They expect them to play nicely and work with others (e.g., if they think they've identified issues with wayland, work to fix wayland; or cooperate in designing a next-generation replacement) instead of striking out boldly on their own every 15 minutes.
Again, if they had proven really good at it, they'd get a pass to some degree—but they haven't. Instead, by making technical merit less of a factor in determining what becomes widely used, they're effectively reducing the overall quality of FOSS. Can you blame people for being annoyed?
> [if Canonical were] gaining users the hard way, by being better, then nobody would blink an eye.
Maybe they are "being better".
I have no doubt that technically Wayland is better than Mir, but for me (as a user who lost interest in configuring Linux back when FVWM95 was considered interesting) I don't care.
I use Ubuntu (including Unity!) because it works. I never, ever want to deal with weird configurations to work around some bug, and for the most part Ubuntu lets me avoid that.
For me, "better" means "it just works". No other distro does that as well as Ubuntu.
(BTW, Unity isn't that bad once you've used it for a while)
I agree. I, for one, also like Unity. My mother, who is in her late 60s, also learned it quickly and now likes it. The interface is less cluttered and easier to use.
At first there was a small learning curve, but I don't remember often thinking "hmm, I wish I had my GNOME 2 environment or my KDE environment back".
Maybe I too got old and spending hours tweaking my GUI settings just stopped being fun. I just want to fire up the machine and get work done.
Can you elaborate on what you dislike about bazaar and Unity? Or how Canonical is able to "force" a vcs on its userbase?
I've used Unity since it was released, never had any issues, and generally think it's the best GNU/Linux window shell around.
Bazaar is also ridiculously simple, and I've started using it as a front-end to every other VCS. The support for git is a little rough, but I can do in one command what git users have to do in four, so I deal with it.
I think that the vocal minority of Ubuntu power-users (which is sort of a laughable term) hates change. This is what made Unity such a "failure" when really it's a pretty great system.
Launchpad supports Bazaar only. That's one of the "top ten reasons" to switch to Bazaar according to the docs [1].
That said, I don't really think Canonical is "forcing" anyone beyond playing the "defaults" card in the distribution they control, and even then you can simply not use the stuff that is installed by default (although I think that making the desktop dependent on some applications that I may not want to use is a little bit against the bazaar philosophy of Linux distributions).
Launchpad, like GitHub, is offering a service, and a free one at that. It is not a valid argument; it would be like asking GitHub why they don't support other VCSes.
> making the desktop dependent of some applications that I may not want to use
Here's my power user setup for dual monitors with Unity:
1. Plug the second monitor into the HDMI port.
2. Use Unity.
What did your colleague have trouble with?
(Four might have been slight hyperbole, but git is a usability nightmare. I forget exactly which task I can do in one command that takes 3-4 in git; there's probably some set of absurd arguments to do it in fewer, but I don't know them and am not willing to spend hours reading tutorials to discover what magic buttons I have to press to get git to do what I want.)
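For what it's worth, the difference people usually point to is the staging area: bzr records every modified tracked file in one step, while git normally wants changes staged first. A minimal illustration (assuming a repo with edited, already-tracked files):

    # bzr: no staging area, one step
    bzr commit -m "fix bug"

    # git: stage tracked changes, then commit
    git add -u
    git commit -m "fix bug"

Not four commands, but it is the kind of thing that adds up.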
I remember him trying to move a window to the other monitor and having it bounce back repeatedly. Very funny to watch.
> git is a usability nightmare.
No argument here. The design of the command-line UI is terrible ("git checkout" is the first offender you're likely to meet). However, it's powerful enough that I use it all the time, occasionally bridging with SVN when needed. Once you get hooked on cheap local branching, you can never give up.
Eh, Bazaar has cheap local branching too, as far as I'm concerned. And yes, I don't have to deal with git checkout.
Window management can be funny -- I think there's a bug in compiz where a display will "reject" windows if they're too large (when unmaximized) for that display. This causes the drag-to-top motion to hit the display the window is currently on, so it goes back to the original display. Using control-alt-numpad5 works, as does unmaximizing your windows and making them smaller than your smallest display.
I beg to differ. I used Red Hat, Mandrake, Gentoo, and Debian, and I have now been happily using Ubuntu for a long time. Unity is great. I barely need to do any configuration with it; everything just works. Under the hood it's still good ol' old-school Linux.
I have tried Mint on two other computers and, for all the talk about Mint being the next great distro, I find it to be far less stable than Ubuntu. I have also started using only Ubuntu's LTS releases with a handful of PPAs to get more recent packages. That helps a lot with stability.
One of the largest issues that I can foresee with this is proprietary drivers for graphics cards. NVIDIA has delivered high-quality drivers for Linux/X11 for their cards and tends to release new drivers quickly when new X11 versions come out. Are graphics hardware manufacturers expected to support separate drivers for Linux (X11) and Ubuntu (Mir), and potentially for distributions that use Wayland? I can only imagine how AMD/ATI will handle putting out graphics drivers for some other windowing system that Ubuntu decides to use.
From TFA: "Right now, Mir does not run on desktop hardware that requires closed source drivers. However, we are in contact with GPU vendors and are working closely together with them to support Mir and to distill a reusable and unified EGL-centric driver model that further eases display server development in general and keeps cross-platform use-cases in mind." Good luck with that. How many years did it take to get decent AMD drivers for Linux/X11? The Catalyst X11 drivers are still a mess.
> Ability to leverage existing drivers implementing the Android driver model [1]
This is huge for ARM; very few vendors bother developing any sort of OpenGL driver for X11, and drivers for the same GPU are often not only SoC-specific but sometimes device-specific as well.
> What I like about the GNU/Linux ecosystem is that a lot of distros share a lot of common underpinnings, and everyone benefits from a large community fixing bugs and improving those underpinnings
Exactly. Standards exist in the GNU/Linux world for a reason. Mir, Unity, etc. don't seem to play too well with other distros and are very tightly integrated with Ubuntu; if this continues, Ubuntu might turn into a slightly more open OS X, which isn't good at all.
Standards are great and all, but holding on to those standards is what's keeping back the UX of Linux. GNOME 3 is familiar, but very dated. X is supported by all, but it's a tangled mess that everyone hates.
Yes, Ubuntu is straying from the pack, but I'd rather have them stray, and the rest follow when they find something good, than have the entire Linux community sit in the same place it has been for years. Regardless of what path Ubuntu takes, I think it will be good for the Linux community to have that diversity.
Have you used Gnome 3? Familiar and dated are not words I would use to describe it. The Gnome team is clearly focused on user experience and without rendering judgment as to whether or not they've succeeded (disclosure: I think they largely have), it's immediately obvious to new users of Gnome 3 (see Torvalds for the canonical case) that the product is not clinging tightly to familiar design patterns.
RH is doing similar stuff, although they're more deeply involved, and while I don't especially like how they impose some of their technology, at least it's generally well thought out.
That's what open source is all about, my friend! The first distro I ever used was Debian, though only as a live CD, because those were back in my WinXP days. But I immediately fell in love with apt. Then about two years ago I moved to Ubuntu; it's nice, but I didn't love Unity (didn't hate it either, just different). A friend recommended Mint, and I have been hooked ever since.
Much more open. Like actually open. There is a reasonable case that there has been too little innovation in the last 10 years. I first used X Windows more than 20 years ago, and it was amazing then, but is holding stuff back with old standards the right call now?
There are very few standards in the GNU/Linux world. GNU and Linux have, by themselves, become standards through ubiquity. POSIX is one, yes, but it's woefully inadequate once you've gotten used to all the bells and whistles any modern GNU/Linux distro puts on top of it by default.
And what about the BSDs? They certainly don't play well with Linux, or vice versa, but that's hardly an argument for calling either an agent of closedness.
Placating the tech crowd by giving away fragments of low level code is really just marketing and a way to retain talent that would otherwise refuse to work for them. It's not as if their users can actually use any of that stuff to change how their macs work (without switching to a different OS or browser).
WebKit for one has fairly wide adoption outside of pure Mac OS X. As for the rest of it, just because something is open source don't expect many people to use it.
Yes but then it's also being developed by far more companies than Apple.
For 2012 Google submitted roughly twice as much code to Webkit as Apple (which came second), apart from them we have smaller yet notable contributors like RIM and Nokia (7% and 5% respectively).
Let's not forget that reduced maintenance costs are often touted as a reason to open-source code in the first place. And when you consider it was developed by Apple, and so presumably fit their needs well when they released it as working code, it's not surprising that adapting it to new use cases involves more work than maintaining it.
Actually, I was just pointing out that they are not open in any meaningful sense of the word - their products are very, very closed and controlling and that's how they like it. Comparing them to Canonical is in my opinion very silly for that reason, it's apples to oranges.
Didn't Apple buy CUPS specifically so that they could keep parts of it proprietary?
They're definitely only funneling money into LLVM so they can drop their dependencies on GCC, which will probably have a net harmful effect on the free software culture.
It's sort of obvious? I can't see any other reason Apple would have for it, other than making compiler researchers' lives slightly easier. It'd certainly make sense after they were forced to open-source the front end for Objective-C.
Apple had two technical needs that GCC could not reasonably satisfy. One was to compile Core Image filters. When you make a chain of filters, they treat it as a single complicated filter which they JIT compile and optimize for your GPU or CPU (I believe they decide at run time which to target based on what will be best for the particular complex filter you have defined).
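As a rough sketch of why that matters: interpreting a filter chain means a full pass over the image per filter, while a JIT can fuse the whole chain into one specialized pass. Conceptually, in C (the filter bodies and names here are made up for illustration; this is not Apple's API):

    #include <stddef.h>

    typedef float (*filter_fn)(float);   /* hypothetical per-pixel filter */

    /* Interpreted chain: one full image pass (and an indirect call per
     * pixel) for every filter in the chain. */
    static void apply_chain(float *px, size_t n,
                            filter_fn *chain, size_t nfilters) {
        for (size_t f = 0; f < nfilters; f++)
            for (size_t i = 0; i < n; i++)
                px[i] = chain[f](px[i]);
    }

    /* What the JIT effectively emits: the whole chain fused into a single
     * pass with every filter body inlined, specialized for CPU or GPU. */
    static void apply_fused(float *px, size_t n) {
        for (size_t i = 0; i < n; i++) {
            float v = px[i];
            v = v * 1.2f;                 /* e.g. an exposure filter, inlined */
            v = v > 1.0f ? 1.0f : v;      /* e.g. a clamp filter, inlined */
            px[i] = v;
        }
    }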
Another technical need they had was for a compiler system that could be easily and tightly integrated with other tools, such as IDEs and debuggers.
Either of the above would have required extensive modifications to GCC, and they would not have been able to get those modifications into the upstream. The result would be Apple would have to maintain a fork of GCC. They'd be spending a lot of effort porting things from upstream into their fork.
1) Replace GPL code with BSD code
2) Release a subset of the features to open source community
Why would Apple be so allergic to GPL unless they want the ability to close stuff? If really you think Apple would stay open if they dominated the market look at the iPhone.
This is only "hurting free software" if you are a complete radical like rms, who believes that any closed source software is an affront to humanity, blah, blah. Otherwise, it doesn't matter what Apple closes, as everything that was Free before, is still Free afterwards. If you don't like Apple's stuff, fork it. If you don't like closed source software, don't use it. I don't see anybody putting a gun to anyone's head and forcing them to use Apple stuff.
Your metric is essentially "no knowledge is lost from the Free world." However, this means that all free software developers could die tomorrow and your metric would be satisfied.
A better metric is "the most free software that could be made is made." Switching from a GPLd compiler platform to a non-copyleft "freemium" Apple platform is worse for this.
If you don't care about free software as a political goal, there is no purpose to have this discussion in the first place, as you don't care about free software being hurt.
I think you're setting up a false dichotomy here: "you care" or "you don't care". Reality is more nuanced than that. Also, the idea of "all free software developers dying tomorrow" is just hyperbole, since the chances of that happening are essentially nil.
Yes, we'd all like to maximize the amount of F/OSS software in the world. F/OSS is a Good Thing, and I founded a company based around F/OSS for a reason. But it's not this horrible tragedy / affront to humanity, if Apple (or whoever) takes a step away from a purist "Free" software position.
To put it another way: no one owes you (or me, or anybody else) a world full of all the Free software we want. And even more so when plenty of people in the world don't care about software freedom. As long as those of us who do care have the option to fork and continue development of projects, then the actual freedom remains, as far as I'm concerned.
> Also, the idea of "all free software developers dying tomorrow" is just hyperbole, since the chances of that happening are essentially nil.
That is exactly my point. You proposed a metric for the health of the free software culture: "Nothing that is free now is non-free in the future." This is a bad metric because if you eliminate all free software development, it reports "everything is okay."
> You proposed a metric for the health of the free software culture: "Nothing that is free now is non-free in the future."
It seems to me that you're turning something analog into something binary. "Nothing that is free now is non-free in the future" is true, relative to any particular project, and is - as a worst case - not so horrible. But applying that to "free software culture" in general, as a comparison to the idea of ALL free software development stopping, doesn't sound reasonable to me. It's like you're suggesting that, say, Apple, moving away from GPL'd "Free" software towards BSD software (and possibly a "free core" model) automatically implies that everybody else does too. But that's just as likely to happen as your hypothetical of every Free software developer dying tomorrow.
The "every free software developer dying tomorrow" is a hypothetical statement that has nothing at all to do with Apple. It was in my comment because it's a great way to illustrate that even if things aren't explicitly subtracted from the Free world, the Free world is harmed if its growth is slowed.
OK, but "harmed" is a broad term. I'm harmed if I stub my toe, but I'm also "harmed" if I fall of a bridge and land on my head and crack 4 vertebrae. But there's a big difference between those things.
And anyway, my point was (at least partly) that the growth doesn't necessarily stop because of - for example - the Apple deal (or something like it) because people can always fork. And if the last GPL'd version of something that goes closed is popular enough, it gets forked. See: Nessus[1] / OpenVAS[2].
How is it "slightly more open"? From what I can see, all the code in Ubuntu is open source to the fullest extent of any accepted definition out there, including the full freedom to fork and start a distro that pulls from both Ubuntu and third-party patches, if you feel that Canonical is not accepting said third-party patches readily enough.
Which standards are these? Besides the kernel being compiled with gcc, dynamically linked ELF binaries as the default, and a libc on the system, the only standard I ever knew of was the LSB, which I believe does not include anything about graphics (I could be wrong).
They're de facto standards, not formal standards: the fact that you can compile a GTK program and it will run on most Linux and even other Unix-like systems without problems.
It means that someone creating a program, proprietary or open source, can, with little effort, run that program on most Linux distros.
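As a trivial but concrete example, a minimal GTK 3 program like the one below compiles and runs essentially unchanged on any of those systems, given the distro's GTK development package; nothing in it is distro-specific:

    /* hello.c -- build with:
     *   cc hello.c $(pkg-config --cflags --libs gtk+-3.0) -o hello
     * The same source builds on Debian, Fedora, FreeBSD, etc.
     */
    #include <gtk/gtk.h>

    int main(int argc, char *argv[]) {
        gtk_init(&argc, &argv);

        GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        gtk_window_set_title(GTK_WINDOW(win), "Hello, de facto standards");
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);

        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }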
We have quite a handful of internal GUI tools; some use Qt, some use GTK, some are written in Python and some in C or C++. The GUI stack is the biggest part of the stack here, but they use a few other libs as well, e.g. cURL, OpenSSL, GSL.
All of these are readily available, and we run these tools on various versions of RHEL, Debian, Ubuntu, and FreeBSD because of the de facto "standards" provided by these environments. Even porting some of them to Windows, where many of these libraries are available, would take too much time because of minute details; there are other, real problems we need to focus on. We hope we can still run these programs on Ubuntu in the future, but it would not take that much to make it infeasible.
I actually do share your fear. That's the reason I run Debian. I really hope this does not destroy Ubuntu as a Linux distribution (which I believe is the standard you and the poster I answered were referring to); as long as it remains based on Debian, I believe there's little to fear.
> Standards exist in the GNU/Linux world for reason.
That reason being that they grew organically. And the landscape isn't even unified as it is! Most distros come with GNOME, but you can run KDE if you are a bit more leet, or if you want something more lightweight you have Xfce. Unless...
I'm happy Ubuntu is showing some vision for what their OS can become and is working towards that goal.
They are not talking about Ubuntu being closed. They are saying that Ubuntu would drift toward being more like Mac OS X, but be more open than Apple is with Mac OS X.
> Maybe this is the kick in the pants Linux needs to increase adoption. But I would much rather know GNU/Linux, not Ubuntu.
I'd rather know Unix, not GNU/Linux or Ubuntu. Maybe Ubuntu diverging a bit from other Linux distributions, and from GNU, will encourage people to write portable code.
Canonical might have problems with the current de facto toolchains of the GNU/Linux ecosystem, or they may be too large or unmaintainable for what Canonical wants to do. Like you said, maybe it's a good idea, maybe it's not.
Starting clean, you can take all the knowledge gained from previous open projects and create something fresh that is more lightweight, faster, and easier to maintain. But will these new components live up to expectations? Only time will tell, and Canonical is willing to back that venture.
I also believe one needs to stop calling Linux "Linux", or the less-adopted but (for me) more correct "GNU/Linux". Call the operating system you use by its distribution name; that is the actual ecosystem you are using. Which, yes, is part of a greater open source, free software ecosystem.
But to achieve innovation one needs to push boundaries, and at the moment Ubuntu is the desktop OS with the most commercial and adoption success.
1) Make the GNU/Linux stuff work on Android. That's what Tizen does, AFAIK. It's a lot of work. Advantage: everything works with it, and it's well understood.
2) Glue stuff up to work with whatever Android does. That's what they're aiming for. Advantage: less work.
Why didn't the Android team make the GNU/Linux stuff work on Android? Because it is a lot of work?
Why, then, should another company spend resources to do that work?
I don't understand what you mean by your second statement. But that might just be because I'm dead tired. But I can not stop myself from reading and typing!!1
> Why didn't the Android team make the GNU/Linux stuff work on Android? Because it is a lot of work?
Because it's GPL licensed. That's it.
Google has spent a lot of time and work ripping out perfectly good pieces of code that are GPL-licensed and replacing them with (sometimes inferior) BSD- or Apache-licensed pieces. At some point, the only piece of GPL code left will be the Linux kernel.
Device manufacturers' lawyers are very paranoid. That's (part of) the reason Android has its own BSD-licensed libc, "bionic", instead of glibc.
When Google ported Chrome to Google TV, which runs Android, they ported glibc too. Apparently, it was easier for them to port glibc to Android than to port Chrome to the more limited bionic.
Yet in the end, you pull the entire Linux FOSS community towards whatever whim Google is following at any given time. If they suddenly redid the graphics API and threw out their current batch of binary blobs, the FOSS space would have to get back up and try again after having the rug pulled out from under it.
I don't see it as good if Google can lead FOSS desktop development around by a leash.
I switched my personal computer from Ubuntu some time ago. It's kind of startling when you see the divergence from core Linux tooling that Ubuntu's been doing.
I believe this is the kick in the pants my team at work needs to evaluate moving to Debian deployments instead of Ubuntu.
They are not changing the path of Linux, but the path of graphical UX on Linux. I think it's a good thing. The GUI is simple enough for my grandma to use, yet I can open a terminal and have a very familiar experience. I see no harm here, as they will not diverge from Linux itself.
Critically, they seem to be putting a lot of effort into being inter-operable with Wayland and X. We have two major widget toolkits and it's not a problem because I can use applications from both without any trouble.
Wayland is already mostly there; it was put together by experienced X developers who've given up on trying to make X do everything modern display environments need. The main problem with moving away from X has been the lack of a single, demonstrably better target to move to. Mir will fragment that effort, and the rationale seems to be "NIH".
> What I like about the GNU/Linux ecosystem is that a lot of distros share a lot of common underpinnings, and everyone benefits from a large community fixing bugs and improving those underpinnings.
I don't know if this is necessarily a bad thing or not. I mean, sure, it's handy to be able to switch distros seamlessly, but has that ever really been possible (aside from going from, say, Ubuntu to Debian, or Fedora to CentOS)? Going from Fedora to Ubuntu or vice versa, for example, has never really been a completely seamless experience.
It's always been the case that, in the strictest sense, "Linux" is a kernel, not an operating system, and each distro is really its own OS. That they had a lot in common was a fortunate bit of happenstance in a lot of ways. Now that they're starting to diverge, that may mean more competition, which should lead to faster innovation and even more progress. And as long as everything is F/OSS, the distros that pick a bit of tech that "loses" can always switch to the "winner" later.
I'm not saying that it would be totally pain free, mind you. But I can see how this sort of move might benefit everyone in the long run.
From the article (and I'm sorry if this seems condescending; I would truly love to have a conversation about this, but I believe your question is too vague):
* The input event handling partly recreates the X semantics and is thus likely to expose similar problems to the ones we described in the introductory section.
* The shell integration parts of the protocol are considered privileged from our perspective and we'd rather avoid having any sort of shell behavior defined in the protocol.
What worries me is fragmentation of drivers. It can be a very serious problem.
Possible scenario:
Nvidia waits for broader Wayland adoption before releasing a Wayland driver. Along comes Canonical and drops Mir on our heads. Nvidia scratches its head and says: forget Wayland. I'm not sure if they'll say forget Mir as well (they have better things to do than deal with this mess), but all of this already doesn't sound good.
Consider game development. People wait for better drivers (Wayland)? They aren't coming. Should they say forget this mess (Linux)? Or does Canonical think everyone should refocus all driver development on Mir, the way Android did on mobile, creating horrible fragmentation again?
I don't think so. The reason we've had Nvidia and AMD support on Linux for so long is the large 3D/SFX and GPU-accelerated number-crunching presence Linux has.
The end-user Linux desktop market has never really been on their radar. Perhaps that will change somewhat with the advent of Steam, but really I expect that unless the aforementioned markets adopt Wayland or Mir (the latter being most unlikely, IMO), NVidia and AMD won't target them with proprietary driver support.
> we are in contact with GPU vendors and are working closely together with them to support Mir and to distill a reusable and unified EGL-centric driver model that further eases display server development in general and keeps cross-platform use-cases in mind.
I hope the latter means Wayland-compatible drivers, because if not, it's going to be bad.
My interpretation (and apologies if this is obvious to you) is that they will define certain interfaces between the driver and the outside world, and any manufacturer will be able to provide something (a library or a daemon, I guess) that implements those interfaces.
When they say "cross-platform use-cases", I think it means they want to make it easy to build the Mir driver and Android driver from the same source tree - and quite possibly the Windows and OS X driver too. So hopefully they will have some quite general interfaces.
I think the ideal scenario is that the driver interface they define is very general, and then there's a wrapper that turns that interface into an Android driver, another that turns it into a Wayland driver, another that turns it into an OS X driver, and so on. But I don't know if graphics driver vendors would accept the performance penalty of even a very thin wrapper.
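To make that concrete, something like the following is presumably what's meant. This is purely speculative; none of these names come from Mir or any real driver model:

    /* Hypothetical vendor-supplied entry points that know nothing
     * about X, Wayland, Mir, or Android. */
    struct gpu_driver {
        int  (*alloc_buffer)(unsigned w, unsigned h, unsigned fmt,
                             int *handle_out);
        int  (*post_buffer)(int handle);   /* flip buffer to the display */
        void (*free_buffer)(int handle);
    };

    /* A per-display-server adapter is then just a thin shim; a Wayland
     * or Android wrapper would reshape the same calls into whatever the
     * compositor or hwcomposer expects. */
    static int wayland_post_frame(struct gpu_driver *drv, int handle) {
        return drv->post_buffer(handle);
    }

If the wrapper really is just a function call or two per frame, the performance penalty should be negligible; the cost would only show up if the interfaces forced extra copies or synchronization.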
I see all this as a huge waste of effort, an unnecessary complication, and a source of increasing confusion. The main drive behind this seems to be Canonical's desire to control the development (they can't do that with Wayland), rather than any valid technical reason. Quite disappointing.
Here is what Wayland / X.org / Mesa developers have to say on this:
I didn't take the driver situation into consideration, and it seems that Ubuntu is contacting closed-source manufacturers to arrange driver support for Mir.
I like your scenario, because I believe an interesting element is that Valve is trying to port most of its Steam games to Ubuntu, and they do have muscle with Nvidia and AMD. And there are rumours that the Steambox, called Piston, would be based on Ubuntu.
So in my opinion it looks bad for distros not using Mir and good for distros using Mir. Whether that is a good thing altogether is another debate.
And as mentioned above, professional game development is coming to Ubuntu at a rapid pace. Maybe at the price of another horrible fragmentation, as you call it.
I don't consider Ubuntu usurping the focus of game development and drivers a good thing. And a notion like "using Mir - good for you; not using Mir - too bad, no drivers" doesn't sound good either.
Ubuntu is already quite isolationist. This will only make matters worse.
Why would you say Ubuntu is taking game or driver development by force or illegally? As far as I can tell there is no other competition in this space, so it is actually very open, and it's not like Ubuntu is unwilling to share.
I agree with the notion, but we still need to see if it will play out that way. For now it's only on the table, with a lot of other cards we are not even aware of.
I didn't say anything about "illegally". It's all open - "do what you want". The question is whether it benefits the global Linux community, or creates unnecessary internal competition, fragmentation, and spreading of resources.
Competition for drivers can start (if they aren't compatible); look at Android-targeted hardware and see how sick it is now (it's SurfaceFlinger-only; try getting Wayland drivers, and good luck with that). We don't need the same story repeated with Mir.
Sorry, I looked up the definition of "usurping" and ended up being pedantic, because I don't believe it's true.
I understand all the concerns about how it will impact the global community; but isn't it generally held that competition is good? And whose resources is it spreading or fragmenting?
* A distribution that tries to cater to everyone?
* Other companies' employees who now need to understand and support technology they don't want to use?
* "Linux" users (in double quotes) because they want to be able to administrate all their different boxes, with different distros, using the same old tools of 20 years ago?
* FOSS coders who spend their free time fixing bugs and adding features to projects they like and use?
Call me ignorant, and please point out why, but I don't see any of the above as resource issues. The only problem is the "Linux" users, and I feel that if they start to see themselves as users of distros, instead of users of a name given to a kernel and a foundation, that problem will sort itself out.
By your account it seems like it already started and Ubuntu might be able to bring in a second or third option.
By the looks of it, the greater community would rather have Ubuntu spend time bending existing tools to do what it needs to do, to benefit the community, without realising that Ubuntu is the product of a company that has goals and cannot always take the long road of community input. But at least it tries to be as open as it can.
And it gets criticized a lot because it is a leader in a "Linux" environment it mostly created from scratch.
Competition is good when there is choice, i.e. when there is actual competition. When it plays out as an edge case (a monopoly-like situation: "SurfaceFlinger only", "Mir only"), it's not good.
Canonical draws its success from the community (Debian and others). Given that, being selfish and isolationist while at the same time claiming to be "the Linux" is not considered respectful. I personally avoid Canonical/Ubuntu because of that.
Would you care to elaborate on why you dislike launchpad? I haven't used it as a developer, but as a user I've found it really useful, particularly the way it integrates with apt-get via PPAs.
I don't understand all the negative reactions. Canonical is recognizing various problems in making GNU/Linux mainstream. They are then innovating at a deeper level (fixing root causes rather than duct-taping) to ultimately attempt to really attract the layman to a mobile or desktop GNU/Linux distro. Devs don't need to target Mir if they don't want to, Linux users can switch to another Debian derivative if they don't like it, and the layman discovers that Linux can possibly be just as shiny as Mac OS.
Can someone explain to me why this is all so horrible?
Are these people designing a better answer to Wayland because it's really better, or because it always looks easy when you don't intimately understand the problem space? I honestly don't know, but it's a question I'd like answered before I get excited.
I'd like to hope that, since Wayland has been in development hell for ~5 years now, if the Mir people are doing their job, they are intimate with both X and Wayland, see where Wayland improved and where it made mistakes, and will fix those.
1) For the past two years or so the community has been led to believe Canonical would be adopting Wayland. After the slow build of anticipation over that, it was dropped out of the blue.
2) Simple distrust of Canonical's homemade projects after the Unity fiasco.
3) Fragmentation of effort to unseat X. The chance of repeating the history of every other display server that was going to "replace X within Y years" becomes that much higher with two competing alternatives marginalizing each other.
If this news had come three years ago I imagine the response would be wildly different.
1) I was also under that impression, but come to think of it, who created that expectation, the community or Canonical? I don't know.
2) Calling Unity a fiasco is like calling it a failure, which I believe is simply not true.
Yes, some people didn't like it and moved on, or made their opinions heard on forums, etc.
But Canonical has a plan with Unity, and at the end of the day it gives a better user experience to me and hopefully to a lot of other Ubuntu users.
3) There are reasons to unseat X and there are reasons not to. But what to unseat it with lies with the people who have particular problems, and maybe the problems Ubuntu faces with Mir are different from the problems Wayland and its community face.
I personally would rather see two different tools than one large tool like X, and if you read the article they explain why they decided on Mir instead.
I did a quick Google search for "ubuntu having problems with wayland" and it returned some interesting links about developers struggling to get Wayland working, so it didn't get dropped suddenly.
It seems that the Mir spec is a public statement that, after two years, they are giving up, with reasons and an alternative.
Some individuals are up in arms in the discussion, but how should Ubuntu have handled it differently while keeping to a deadline?
> Calling unity a fiasco, is like calling it a failure.
This has gotten singled out so I'd better say that I meant to find a more polite way of saying "shitstorm". It's not a judgement on Unity itself, but I think it's fair to call the reaction at the time a fiasco.
I still use Unity, for some reason, and I am shell-shocked it was allowed to be released.
I've never worried about window focus failing to work properly until Unity. I've never had terminal windows fail to be refreshed when scrolling text until Unity.
Personally I love it. I had planned on installing a tiling window manager, but after 10 minutes with Unity I decided to stick with it. It's the first Linux GUI I've found ok since Enlightenment 0.16, and the first time since I left the Amiga behind I've had an environment I'd say I'm happy with.
And most people I've heard who have used Unity love it. The only people I hear complaining are typically people who expect the Linux desktop to remain static and unchanging the way it was when they first used it.
Unity might be a fiasco with a small subset of old Linux users, but Unity or something like it will be essential for Canonical to keep attracting new users, who are expecting polish at the level of OS X rather than at the level of the Linux desktop of a decade ago.
>The only people I hear complaining are typically people who expect the Linux desktop to remain static and unchanging the way it was when they first used it.
Or who are stuck on older hardware (why was Unity 2D retired, again?)...
Or who think it's utterly silly that you have to write an INI-style (.desktop) file to put arbitrary links in the "dock" (sketched below)...
Or who think that the interface wastes a lot of space...
You've gone and generalized a lot of people with legitimate complaints about that DE. Kind of like Ubuntu's developers do.
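For reference, the .desktop format is a plain freedesktop.org INI file dropped into ~/.local/share/applications/; a minimal, made-up example:

    [Desktop Entry]
    Type=Application
    Name=My Tool
    Exec=/home/me/bin/mytool
    Icon=utilities-terminal

Not hard once you know it, but compared to dragging an icon onto a dock it's a fair usability complaint.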
I despise Unity. I'm only using it right this minute because our company standard desktop at the $DAYJOB is Ubuntu.
Personally I find the state of desktop Linux to be fairly close to abysmal at the moment. I'm not sure we're any better off now than we were with Gnome circa 2000 or so.
I had, and still have, high hopes for KDE once Qt went LGPL, and I do run KDE on my personal laptop. It's not bad, but there are some major pieces of software that don't have Qt native versions. sigh
Regardless of the technical considerations, Linux and its surrounding ecosystem have a long history of many different companies cooperating on core software. Canonical's philosophy seems to fly directly against that. So far we have:
Launchpad / Upstart / Unity / Mir
Launchpad is free software by name only, and Canonical actively discourages you from setting up your own instance.
Upstart hasn't been widely adopted outside of Ubuntu, and has been replaced by the technically superior systemd.
Unity has been extremely unpopular from a user-experience point of view, and now we have Mir. So past history isn't filling me with confidence. Their philosophy seems to be "patch first, ask questions later".
I'm amazed how Canonical has the resources to keep branching out so much while producing a distribution every 6 months. I was under the impression they weren't yet making a profit.
Upstart is a bad example. pacman is a package manager used by Arch Linux and nowhere else. Should Arch abandon it because it hasn't seen wide use? Better yet, there are plenty of package managers that are better than both aptitude and pacman (http://nixos.org/). Should we abandon the technically inferior solutions for the technically superior ones?
Unity is the same way: it is open source. If you don't like it, fork it and fix it. Or use one of the alternatives. This is not Windows or OS X. Vote with your feet and move over to Gnome 3, Xfce, LXDE, or another DE. If enough people do, Canonical will see the effects and stop putting effort into Unity.
IMHO, Linux and its surrounding ecosystem has a long history of everyone trying to pull it into their own direction. Companies are often forced to cooperate when they don't have another choice, but if they did, they would push their own ideas of what the infrastructure should be on everyone else.
But I would personally much rather see pacman (and every other distro-locked package distribution format) dropped, with everyone adopting debs or rpms.
What I would like most, though, is for the most technically advanced and easiest-to-use package manager to win, and everyone to just adopt that. I know that pacman makes building packages insanely easy compared to debs or rpms.
> Upstart is a bad example. pacman is a package manager used by Arch Linux and nowhere else. Should Arch abandon it because it hasn't seen wide use? Better yet, there are plenty of package managers that are better than both aptitude and pacman (http://nixos.org/). Should we abandon the technically inferior solutions for the technically superior ones?
A couple of points:
- one example does not plenty make
- Nix has a really interesting technical design. However, it trades the ability to patch security holes across your entire system for the ability to install multiple packages side by side and have atomic upgrades (see the sketch below).
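To make that trade-off concrete: every Nix package lives at a store path keyed by a hash of its complete build inputs, so two versions coexist happily (these paths are illustrative, not real hashes):

    /nix/store/9f8k...-openssl-1.0.1c/lib/libssl.so
    /nix/store/2qx1...-openssl-1.0.1e/lib/libssl.so    # patched build, new path

But because every dependent package is pinned to the exact path it was built against, fixing a hole in a low-level library means rebuilding everything above it, rather than swapping out a single shared /usr/lib/libssl.so system-wide.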
More importantly (I was lying about the two points), the package manager is tied intimately to the guts of the distro, so a distro rolling its own is much less shocking than Ubuntu implementing Unity.
Now, we'll see what becomes of Mir. Whatever mindshare Upstart had went straight to systemd as soon as it came out. If Mir is primarily an expression of hubris, I expect it will follow Upstart into obscurity. If Canonical manages to convince enough developers in the community, they may get something going.
"better" is generally subjective. there's a reason why nixos, upstart, etc, don't catch on.
For pacman vs nix, well, pacman is extremely simple. Extremely reliable.
Nix has some design advantages, but that don't translate all that well in the practical world right now. Maybe in the future.
Upstart was used because it was better than traditional init, and it was used by Fedora as well. Then again, Linux could have had a much better init system many years ago, but the GPL zealots wouldn't hear of launchd, with its less restrictive license, being used.
To clarify, my comment was that even before relicensing, launchd was under a less restrictive license than the GPL -- the APSL allows for binary linking -- but had clauses that didn't allow people to GPL their code. By the time launchd was relicensed, Ubuntu had already released Upstart, and another init replacement so quickly would have alienated people.
I don't see the problem in wanting to keep the guts of the OS forced-free by license. If any of the deepest components of the Linux stack were MIT or BSD licensed, it wouldn't take long for some business to fork it into a proprietary blob they push for market adoption and you lose the freedom deep in the OS.
With something like init, I would have to disagree. The biggest problem with this decision is that they ended up with an inferior project that wasn't nearly as flexible as the alternatives. While there are certain areas where keeping things under a restrictive license like the GPL may be useful, the init system is one where I can't see any way a closed system would offer a competitive advantage.
> Unity has been extremely unpopular from a user-experience point of view
Has it? Do you have numbers?
Because I see a small number of people who complain very loudly, and often make sweeping claims about how unpopular it is without any evidence, who always get followed by responses from people (like me) who love it...
Upstart is used by the current release of RHEL, but it's true that Red Hat has signaled that systemd will come to RHEL as well (Fedora already ships systemd, as far as I know).
I don't see the problem with them using whatever they want if it provides a good solution. When Upstart was developed, systemd wasn't even a viable alternative; for me it still isn't, if stability is what you want and you run a non-desktop system.
Exploring Android internals, I find I much prefer SurfaceFlinger to X in terms of how well I can understand its architecture.
> the Layman discovers that Linux can possibly be just as shiny as Mac OS.
Why do we want the layman to be using Linux? To give Valve more Linux customers or something?
We should let them be. Laymen will invariably be better served by Microsoft or Apple; trying to win them over in some misguided drive to "win" market share seems foolish. Linux should focus on its niche, and Microsoft and Apple on theirs.
This market-share envy makes no sense to me. Does Arctic Cat stare with envy at Ford's user base and expend effort trying to get suburban parents to drive their kids to school on snowmobiles? That would just be silly.
Practically: I want to be using a system that I can be confident has good hardware & software support, not always double checking everything I buy to make sure it will work. That means getting at least a few % market share so that other companies take it seriously.
There's also an ideological angle: many Linux users feel that there's something wrong with a world where only the technically adept can use a free, open source operating system.
> not always double checking everything I buy to make sure it will work.
I feel like we've been there since... 2007? My last several purchases were made with zero caution or research anyway, maybe I've just been getting lucky.
> many Linux users feel that there's something wrong with a world where only the technically adept can use a free, open source operating system.
That's fine and all, I don't buy into that ideology but it is fine if others do. Perhaps someone should warn these unsuspecting masses that "Linux for your technologically illiterate grandmother" advocates are not as pragmatic as they may claim to be. I feel like these people are dressing Linux up as something that it is not in order to sell it to people who, were it presented honestly, would not pay it a second glance.
The hardware situation has got much better, but there are still some problems. Laptops with both integrated and separate GPUs aren't very well supported (so I hear, I don't have one). There's a relatively new scanner in my office that's not supported by SANE. I wanted an external wifi adapter for a desktop recently, and I'm not sure how many will just work with Linux.
On the software side, Steam got ported recently, but there's plenty of other software that we'd like to see. Flash is no longer released for Linux, although for now at least I can still watch iPlayer with the last version of Flash that was released. Hopefully HTML5 will have killed it before much longer.
Of course we shouldn't pretend that Linux is something that it isn't. If it's not good enough, we aim to improve it. But I honestly think that, if non-technical users got something like Ubuntu 12.04 pre-installed on compatible hardware, it would work perfectly well... except for the expectation, both from users and app developers, that everyone has Windows. Which brings me back to the point about market share.
Even drivers that were previously mainline have broken. So while the Intel 4965 used to work fine on Linux, it no longer does. And that is for a driver in mainline.
This is a fundamental misunderstanding of Ubuntu's raison d'etre, which is to bring Free (libre) software to the world.
Except the rest of the (consumer) world doesn't value libre as a first-class feature. They want the shiny.
The way we get to bring our values to the world is to be commercially relevant. Because otherwise, hardware vendors don't care about you, ISVs ignore you, app developers target other platforms, and so forth.
Market share isn't the end goal. It's a means to the end.
"This is a fundamental misunderstanding of Ubuntu's raison d'etre, which is to bring Free (libre) software to the world."
Here's my marketing campaign; sad as it is, it's the best possible spin:
"Hey world, I've got something new for you. It's incompatible with everything and doesn't work with anything and can't actually do anything you want, but hey, it's new, so it must be better, and it does a bunch of stuff that nobody outside the dev community cares about. Oh, and we 'had' to take some stuff away, like network transparency, which is awesome. Err, I mean, ignore the man behind the curtain and someday VNC might work. Oh, and every dev who's finally gotten used to the existing weird design will have to retrain. Other than that, trust me, it's awesome; you'd better switch now. And by switch, I really mean it, since it'll be incompatible with everything else ever written." How could anyone not drop their entire legacy installed base of hardware and software and training and switch? I mean, it's got transparent animated dancing robots, which no one can do without, and it's got electrolytes, which plants crave!
I'm sorry, I just can't do any better given the bad news.
Seems dishonest and manipulative to me. Con a bunch of people into adopting something that doesn't really fit their needs, is just shiny, so that you can use them as leverage... that is not something I am interested in.
Meanwhile all of those unskilled users are not without a cost. Free software support infrastructure is not set up to handle that many unskilled users. If Canonical is going to be giving all of those people 3rd party support themselves, that is all well and good, but it seems to me they tend to punt the ball.
The entire reason desktop Linux has been a failure is exactly that it does not fit the needs of most people. And for that audience, "shiny" is just as important as "featureful".
In the modern marketplace, you can't choose either/or. It has to be both.
Ubuntu wants to do both and additionally add libre.
The strawman fallacy you inferred from my statement has the additional fallacy of incorrect causality.
People are initially attracted to Ubuntu for a variety of reasons, and we hope to keep them because we are fit for purpose, not for whatever ideological reason or niche feature, and certainly not due to any sort of con job.
Desktop Ubuntu has failed to reach that critical mass of users to be taken seriously by industry (our good friends at Valve notwithstanding), and the only logical place to jumpstart the install base is the mobile world.
The mobile market is brutally competitive. Ubuntu will succeed or fail on its own merits, not because of any dishonest manipulation of innocent users.
Has desktop Linux been a failure? By what measure, adoption by people who are not in its niche market? Would it stop being a "failure" if we got a bunch of laymen using it?
Compare community desktop linux distros with community distros of something more oriented towards the masses... say.. Android.
The Debian Project is a well oiled machine, coherent and consistent. Everything has a well defined process. Technically capable users don't have to sort through piles of crap to get work done.
Cyanogenmod, on the other hand, is an utter shitfest. Standard operating procedure there is wading through clusterfucked forums looking for undocumented unofficial fixes (in prebuilt binaries mind you, no source) in threads thousands of posts long filled with idiots saying "hurr durr, I dropped my phone in the toilet, now this patch doesn't work". They cannot handle the size of their technologically illiterate community, and everyone attempting to use their software, technical or otherwise, suffers as a result. (This isn't even touching the issue of shittier hardware support than you would ever expect to see with desktop Linux...)
If piles of unwashed masses were really what desktop Linux is "missing", then community maintained Android distributions should, far from being a nightmare, be the promised land. This is plainly not the case.
Show me the desktop Linux equivalent of Android, not Cyanogenmod. It doesn't exist - and you can measure that failure with any number of metrics.
If Ubuntu can achieve that type of success and protect the freedom of users with the GPL (instead of the permissive BSD model for hardware makers) then we will have accomplished something in the world.
You have missed the point. Show me the Android equivalent of Debian.
Show me an Android project, with all of the unskilled users that come with it, that is anywhere near as organized as the Debian Project. Show me that legions of unskilled users have allowed this Android project to achieve hardware compatibility at all comparable to what Debian achieves on the desktop.
Such a project does not exist. I assert that it does not exist in no small part because they have too many unskilled users, and because hardware support does not materialize as soon as you reach some sort of "critical mass" of unskilled users.
Idiot users are toxic; anything that touches them rots. Only corporations that are prepared to completely disregard community involvement are capable of wielding an idiot userbase. Canonical is neither up to that task nor does it even appear to be pretending to be. Why? Probably because the lunatics run the asylum.
> Why do we want the layman to be using Linux? To give Valve more Linux customers or something?
> Idiot users are toxic; anything that touches them rots.
Wow. The level of arrogance and elitism on display here is breathtaking.
This is why Linux on the desktop is irrelevant. Not the fact that it's incompatible with most mainstream software, not the fact that it's low profile or the fact that most Linux desktops look and function like Windows' retarded younger brother – but because a tiny minority of the otherwise hugely welcoming community set themselves up as some sort of entitled priesthood and actively discourage 'idiot users' from getting involved.
New users, even idiot ones, are good. Yes, they bring problems and stupid questions. Everyone has to start somewhere. But more people involved means more investment and helps challenge entrenched assumptions about how things should work. Making things shinier and more accessible does not equal dumbing things down.
Why shouldn't everyone be able to use an approachable, thoughtfully designed system that 'just works' and yet shares our values of freedom, community and open source?
> but because a tiny minority of the otherwise hugely welcoming community set themselves up as some sort of entitled priesthood and actively discourage 'idiot users' from getting involved.
I've spoken to many people over the years and looked at why they use Windows rather than Linux. The reasons, in order from most common are:
1. Lack of the applications they need; nearly always Outlook and Excel
2. Familiarity with Windows that they feel they've invested a lot of time in
3. Simply didn't realise there even was an alternative
Out of all the people I've talked to, there have been a grand total of ZERO that have ever given "Had a bad experience with a member of the Linux community" as a reason. None. Ever.
So, I'm more than a little dubious about your claim that this is in fact the main reason. Got any figures to back up that little theory of yours?
> Making things shinier and more accessible does not equal dumbing things down.
Except in practice, it almost always seems to. Which is why I don't and can't do any significant development, or work of nearly any kind for that matter, on Android, or an iPad.
> Which is why I don't and can't do any significant development, or work of nearly any kind for that matter, on Android, or an iPad.
That's about picking the right tool for the job, not dumbing down.
It's like complaining that a trowel is a dumbed down shovelling device because it can't make any significant headway with digging a massive trench.
That said, I develop Drupal sites locally on my jailbroken iPad and it's pretty nifty to have access to a full development environment on something that portable.
Arrogance or not, the fact remains that community developed Android distributions have all the unskilled users Ubuntu could ever dream of having, yet the projects are even worse.
Projects that rely on community support simply cannot handle the load. It is a failing of community driven open source to be sure, it would be great if it were otherwise, but I simply see absolutely no reason to think it is not reality. Your theory is nice, and it would be nice if it were reality, but it does not fit the data.
Show me the desktop Linux equivalent of Android, not Cyanogenmod. It doesn't exist - and you can measure that failure with any number of metrics.
Sounds like what you're really saying is "you can come up with any metrics you want, then define that as failure to justify my position". There's really no objective basis for saying "desktop Linux is a failure", especially when there's no objective standard by which we could say "desktop Linux is a success" either.
Like somebody else just said, "market share? Who cares?" Market share is a means to an end, not its own end. Desktop Linux works reasonably well for a certain population of people with a certain set of values. Does anybody expect desktop Linux to gain Microsoft-Windows-like ubiquity?
It's an incredibly useful tool, with a lot of very smart engineers contributing awesome stuff to it. It's a development platform second to none and far more capable than any other OS I know of.
I don't think it's conning them into something they don't need. I think it's giving them what they want (shiny, easy to use) so that they also get something they need but don't realize it yet (libre).
I agree. But then the great thing about Linux is we don't have to use the layman distribution.
If someone wants to put the time and effort into making something that is great for my grandparents but awful for me then more power to them.
Meanwhile the rest of us can stick to the distros that are suited to our needs. I think this state of affairs is much better than any one operating system ruling. So long as we have binary compatibility among the Linuxen then I'm happy.
Who are "we"? I only see a company and a foundation geared to serve a platform, namely Ubuntu, not Linux.
Yes, Ubuntu is based on a FOSS ecosystem, but that does not mean it shares the same goals as Linux, whatever the goals of Linux are.
It means it's backed by a company which, in my eyes, wants to bring the wonders of a second Unix-based operating system to the layman. Only in this one everything is open, and that, for me, makes sense.
It could help make the software market more competitive.
We can already see that Apple and MS want a lot of power over their platforms to the point of being able to decide what software will run.
A more neutral platform in wide use could make the playing field more level thus leading to better software in the long run.
Wait until the only way to run FF / Chrome / a 3D-accelerated video driver / vi / emacs / something else that's "critical" is to switch to Mir. After all, it's free; all you have to do is abandon everything else. What could go wrong?
"Simply don't use it" is like telling people to take up a new hobby once theirs has been destroyed. But I don't want to take up a new hobby; I like this one. Sorry, no, someone else has decided to destroy it, so don't complain or try to work around the damage, just select a new hobby. Hey, you look like a technical type of person; sorry about the death of free software and all that, but I bet you'd like ham radio, or maybe advanced model railroading? ...
If a free software program decided to stop supporting X and Wayland, it would be because no one cared enough to contribute the code for continued support, which would imply that very few people were still choosing to use X or Wayland over Mir. If Mir really ends up being that wonderful, why shouldn't it win?
How does one free software program replacing another constitute the death of free software?
vi or emacs? Not going to happen; these are among the most portable pieces of software that exist.
As for Chrome or FF, that only depends on the developers who use the product. I doubt that Google will ever target Mir directly; more likely the GTK+ backend will come to target Mir, and then boom, all GTK+ applications will run on Mir, including Chrome. If Google ever targets a system with their browser, it will probably be Chrome OS.
Firefox is another beast, but unless there's a decision to abandon Linux altogether there will be devs working on it, or on Iceweasel, its Debian cousin, and this will probably run/compile on every Linux (and BSD?) out there.
3D is a shit area. I believe this move signals that Canonical will try to sell ARM computers in the future, and there's a known lack of free drivers for GPUs in the ARM world. To see what I mean, just search for the Raspberry Pi "open source" GPU driver debacle to see how things are going. This is not a criticism of Broadcom, who has its trade secrets to protect, but the situation there is far from perfect and probably always will be.
If Linux idiots weren't senselessly married to terrible in-group traditions and the status quo, it wouldn't have taken 20 years (or Canonical) to replace the clusterfuck that is X.
I use Linux and I don't even use a graphical environment most of the time because all it does is slow me down. (When I do use one, it's Ratpoison.) The most effective Linux developers I know work similarly. I believe this is a large cause of us being "senselessly married to terrible in-group traditions and the status quo".
I don't know if you're calling all Linux users idiots or what, but I consider Linux idiots to be the people who use Ubuntu and Unity and don't have any idea what's going on underneath. They obviously aren't going to be doing any replacing of X.
TL;DR: Linux is not one thing, and the various projects are not mutually exclusive.
We have GNU/Linux with a huge array of applications all licensed in such a way as to allow forking and customisation.
We have different groups of users: desktop/laptop users; sysadmins; developers; scientific users; people who use administered desktops at work (e.g. French police &c)
We have a smallish and very pushy company in the UK (Canonical) who want to build on GNU/Linux to produce an operating system (Ubuntu) that will work on TVs, phones, tablets, fridges &c. Their target audience won't be developing alternatives to X, and probably won't be writing bash scripts or anything. (That isn't to say that some of their users won't be doing those things.)
We have a much larger company in the US (Red Hat) who provide a stable, conservative desktop as a complement to their commercially supported server OS (RHEL). They part-fund a desktop system that has changed radically recently (Gnome).
There is a German-based company that makes an enterprise desktop OS and server OS and contributes to the development of a different desktop environment (KDE).
We have a not-for-profit, foundation-type entity that pushes out a version of GNU/Linux that can run on a huge range of architectures (Debian).
There is this thing called 'upstream'. Small projects push out all kinds of applications including alternative desktop environments.
So Canonical does its thing; some of the stuff they do may make it back into mainstream GNU/Linux, some may not. Some hardware manufacturer may or may not take up the system. I may be able to dock my phone to a keyboard/mouse/monitor and write documents in LibreOffice and edit podcasts/photos in a couple of years, or five. (Bring that on.)
The others carry on as they do now. You may have to choose the graphics card more carefully in future but I hope not. I suspect workstation class computers with separate monitor/bases will get more expensive and specialised anyway, and I imagine someone will do a 'high end workstation-OS' just for them.
Care to elaborate? Your comment reads as a sentence-long "NO, YOU!" right now. What about Canonical is a problem, and who are they causing the problem for?
As a "Linux idiot" myself, I hear your disapproval of the parent post, but am curious to hear the details that support your opinion (I might be able to learn something from it).
pilgrim689: Why is everybody hating on Canonical for
ditching X11?
sneak: Because they're "Linux idiots" who love X11.
ihsw: They don't love X11, they just hate Canonical.
ihsw is saying that the change isn't getting negative reactions because of the change itself, but because it is being made by Canonical.
I thought everyone was hating on Canonical for ditching X because rather than support the fledgling standard replacement for X, Wayland, they once again (just like with systemd vs upstart, compiz vs mutter/kdm) go off and do their own thing.
And if it turns out like either of those, they end up with an inferior barely maintained product that gets pushed to the sidelines once they show it off on a showroom floor.
Skimming the article, Canonical has some legitimate criticisms of Wayland, and are working on something that doesn't have those problems. Yay, this would be good! Except that it's by Canonical, and as you said, "Canonical never delivers". (That wasn't a quote from you, it was an iteration of "OP (never) delivers")
What I said was that when Canonical does deliver, they let the delivery wither and die without support, because they are spread thin, like a teaspoon of mayo on their footlong distro sandwich. See: Upstart, Compiz, Software Center.
> It certainly improves certain details of the user experience, when it works correctly, also usually consuming more CPU than flash player
You make a sweeping statement which is not true for every system. Looking at what happens playing a Youtube music video, according to htop:
- pulseaudio is using between 0 and 1% CPU
- plugin-container (wrapping Flash) is at 3%
- firefox oscillates between 20 and 40%
Now, my understanding is that, at the time pulseaudio came out, it was using untested functionalities in audio drivers which, for many chipsets, did not actually work and progressively got fixed. There was the same problem with graphics drivers when KDE 4.0 was introduced. Maybe you're out of luck and actually have a defective driver?
I'm not sure I have a bad driver; the last two chipsets I used were Intel (with no weird audio devices), and they work fine with other software. Latest distros tested: RHEL 6 and Fedora 16.
For me, PA would stay around 5%, sometimes 10%, and even more depending on the task. Not to mention crashing every week (at a random moment).
I'm not sure if this could affect the CPU usage. I hear that you can play with the resample setting to lower it, but I haven't had to do that on a 5+ year-old desktop.
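If anyone wants to experiment with that, a rough sketch (the resampler choice here is just an example; whether it helps at all depends on your hardware):
    # pick a cheaper resampler; resample-method is a stock daemon.conf option
    echo 'resample-method = speex-fixed-1' | sudo tee -a /etc/pulse/daemon.conf
    pulseaudio -k    # kill the per-user daemon; it normally respawns on demand with the new setting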
Nonsense. Fedora is a fine desktop Linux... I run it on a woefully underpowered laptop which I use for development, and it is every bit as usable as, say, Ubuntu (which I use on my corporate laptop). In fact, I'd take my Fedora/KDE setup over this Ubuntu/Unity setup any day.
Am I the only person in the world who has never had anything but a pleasant experience with PulseAudio? The first time I noticed that I was using it, I had already been using it for a year...
No, you're not the only one. I'd been forewarned of the awfulness of pulse, but didn't even notice its installation until later, when I discovered all the useful things I could do with it: play music over the office Apple AirPlay thingy; use two sets of headphones simultaneously; send audio to another computer in the house. Without any typing, searching, or rebooting.
I remember OSS and Alsa, and I don't miss either. As far as I can see, it makes Linux audio work like any other computer in this millennium.
I haven't had any recent trouble with it. Back in the early days, it didn't always work well with SDL, but that was fixed years ago. I've been able to crash the daemon with stress tests, but it seems to restart automatically.
I can't understand all the dislike of PulseAudio either. It certainly has a nicer API than /dev/dsp.
My most recent attempt to use PulseAudio (as it came via Ubuntu) had the quirk of not activating the headphone jack when something was inserted. It dutifully turned off my speakers, but no sound was available elsewhere.
I previously had stock Debian on the machine, and everything was working fine. A half day's effort was put into toying with ALSA configs that Pulse was deferring to for whatever driver choices it made. Dozens of help articles or threads where everyone else in the world could proclaim "Thanks, that fixed it for me!" left me with less functionality than I started with.
The only thing Pulse has ever done right for me was to help record some loopback audio on an otherwise-crippled sound card.
No, there is a decent chunk of users for whom it works fine - maybe as many as 95%. But there's also a substantial number of users for whom it introduces major problems, which would be more forgivable (at least to yours truly) if it provided any visible advantages over existing systems.
(After a few months of trying to make pulse work I left for FreeBSD, which works beautifully and avoids a lot of Linux's (seeming) change for the sake of change.)
For me, PA crashes around once a week (and this is with the latest version I tried), or just plain stops working (like muting the sound, garbled sound, etc)
Not to mention the delay that happens while switching songs in mplayer, for example. Use ALSA and song switching is instantaneous.
Once every two or three months, when I remember to apply updates.
> - What audio softwares you use the most
Only Flash (YouTube, primarily; probably also YouTube's HTML5 player, though I haven't paid much attention to that) and mplayer. A few years ago, XMMS2. I used to use an XMMS2 client I wrote for music; now I just use some bash scripts and mplayer. I can't say I've noticed any delay while switching songs with mplayer or XMMS2.
When I play music from my laptop through my Raspberry Pi connected to my TV with pulseaudio, I notice an audio delay. That is the only sore spot pulseaudio has ever presented me.
Very interesting. I thought some instability may be due to PA staying up for a long time, but apparently not.
About the delay, try this: mplayer *.mp3 (in a directory with mp3 files, of course), then press >. There should be a significant delay in switching songs (like 0.5s). Make sure mplayer shows it's using the [Pulse] driver.
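To rule out mplayer quietly picking a different output on its own, you can force the driver explicitly and compare back to back; something like this, assuming your build has both drivers compiled in:
    mplayer -ao pulse *.mp3    # press > and watch for the ~0.5s gap
    mplayer -ao alsa *.mp3     # same test; switching should be near-instant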
I agree with this. The only problems I've ever had with it have turned out to be either problems with the actual hardware or problems with the clients, not problems with Pulse itself.
I will admit having a lot of hate for figuring out which audio 'profile' is which...
If you've got the plugs for 7.1 and a normal line-in, as well as possibly a digital one, then you've probably got 23 choices of profile for your built-in audio device. Add in a GPU with HDMI and the NVIDIA driver and you probably get at least another 7+ choices, all labeled Nvidia... I find I frequently have to guess multiple times whenever I hook up a TV via HDMI.
But it works quite well, if you define "works" as working without any usability riders attached.
No issue on a couple of machines. Running with a call to start-pulseaudio-x11 from my .xsession works fine. I believe running it as a system daemon is not recommended, though.
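For anyone curious, a minimal ~/.xsession along those lines (the window manager line is just an example; use whatever you run):
    # ~/.xsession
    start-pulseaudio-x11
    exec ratpoison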
Not this again... You could have complained about this years ago, but it makes no sense today. PulseAudio got fixed.
> user experience
You can set up PulseAudio for LAN audio streaming with padevchooser (a GUI). Could you do this with ESD? aRts? Tell me about this audio streaming solution with super UX for Linux that you know...
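For what it's worth, the same thing works without the GUI too; a rough sketch, with made-up addresses:
    # on the box with the speakers, accept network clients:
    pactl load-module module-native-protocol-tcp auth-ip-acl=192.168.1.0/24
    # on the sending box, point any PulseAudio client at it:
    PULSE_SERVER=192.168.1.10 mplayer song.mp3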
Parent seems to have been experiencing this issue recently. Saying "PulseAudio got fixed" does not mean "it works perfectly for everybody" (see https://bugzilla.redhat.com/show_bug.cgi?id=813407 ). Most people will care more about getting a sound system which works without crashing or sucking up CPU than about getting a sound system with streaming. The fact that you and I have had a good experience with it does not mean other people are not having issues with it (just as their bad experience may not reflect the overall quality of the software).
That's a Fedora bug. It wouldn't be bleeding-edge (enough!) if it didn't crash. Running Fedora and complaining in a public forum about crashes (without saying the distribution name in the first place!) makes no sense to me.
"you can set up PulseAudio for LAN audio streaming with padevchooser (GUI)."
Yes, and? I understand this is a useful feature for some people, but I'm just interested in listening to music, and PA does not allow me to do that without frequent crashes and bugs.
I don't care about the features of a car if it stops working at random. And yes, this has happened on every distro I've tried since PA started being bundled. On every single distro, on every single machine, I've only had a stable system with PA OFF.
Bugs can be fixed. Pulse is newer than Alsa and more complex, you should expect bugs. If you don't have a problem with the implementation details of PA (software mixing, optional software equalizing, networked audio, stateful audio adaption to devices) then just report the bugs and make it better. Don't whine and try to fragment the Linux space any worse than it already is, especially with Mir's announcement.
The real question is: why did you need something newer in the first place? Are years of bugs, misconfigurations, and random crashes worth the minimal improvement end users will get in the end (when everything finally works for everyone)?
Yes, pulseaudio has all sorts of awesome esoteric features. You can stream sound transparently from a linux box to a windows one. That's kind of useless if it brings down your computer.
Oh, didn't realize you were asking about the UI specifically. I believe you would just set an environment variable saying where you wanted the sound to go.
The part I liked was that you would start ESD and everything just worked.
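If memory serves, the variable was ESPEAKER, so something like this (the host name is made up; 16001 was ESD's default port):
    ESPEAKER=jukebox:16001 esdplay startup.wav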
I conducted an experiment on my last laptop, having noticed PulseAudio taking up more CPU than my media player. I found, quite consistently, that disabling PA led to my media player using more CPU on its own than the sum of the media player and PA when both were running.
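For anyone who wants to repeat that kind of measurement, a rough recipe (assuming the sysstat tools are installed):
    # sample the daemon's CPU usage every 5 seconds
    pidstat -C pulseaudio 5
    # and, in another terminal, the player itself
    pidstat -C mplayer 5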
I agree with this; I have had similar experiences with pulse. I also use the pulse equalizer, which is super easy to set up.
I like it. I think it does things right, and they can optimize it more over time. It is better than having a fast audio subsystem that just can't do what I want. Pulse is a kitchen sink that at least covers its bases.
Basically, Canonical appears not to be publically consulting with the other distros (e.g., RH, Gentoo, Arch) about their changes. If they were, and there was a general consensus of a plan, I would personally not be bothered at all. But they seem to be content to pursue unilateral forking of what Linux means, and IMO that's not ok.
Canonical is a business. I believe they do care for Linux, because otherwise I don't see why they would be part of the Linux Foundation. But they are a business.
And like most businesses, they need to make unilateral decisions. The only difference is that Canonical makes these decisions public knowledge while it is trying to deliver a hectic roadmap, for business purposes.
Ubuntu has been a Canonical project based on Linux for many years now. It is hardly the community-driven OS that people believe it to be. I think that is a good thing. They are moving forward in some direction. They will make mistakes. Hopefully they will learn from them. There is nothing wrong with that. In general, Linux userspace is very fragmented; organizations and developers seldom agree on anything. Ubuntu can't provide a true desktop user experience in such a space. And there are always distributions like Arch Linux which offer true freedom and completely stick to Linux specs.
The layman doesn't want a desktop at all; they want a smartphone or a tablet. Want to drive mass adoption of Linux? Write a killer app for Android.
The folks who do want desktops, want those desktops to run software like MS Office or Adobe CS. If Linux can't do that, it does not matter how shiny it is.
You are sacrificing your objectivity, there is no reason other distros cannot be 'just as shiny' as Ubuntu or Mac OS.
Furthermore it is horrible because Canonical has a history of secrecy and authoritarianism and they absolutely go against the spirit of open source software development.
If the "spirit of open-source software development" can't stand for distro maintainers to have strong control over what goes in their own distros, this open-source spirit sounds pretty authoritarian itself. If (for example) I want to maintain my own window manager for my distro and I provide the source under a free license but don't accept outside contributions into my own distribution, what is wrong with that? Isn't that part of the freedom of free software?
In my experience, distros attempting to keep things as close as possible to upstream packages (except for bug fixes) are always the most usable, stable, etc. So I sort of prefer the "authoritarianism" from upstream.
Additionally, an upstream package has authority over only ONE component. The distro has authority over ALL components. That makes ALL the difference.
I actually agree. I like Debian better than Ubuntu because I feel like its packages are generally more reliable for my use. But that doesn't mean I would declare Ubuntu to be "horrible" — just not to my taste. I respect Ubuntu's ambition and hope they are successful in what they are trying to accomplish.
Especially since people have been slamming Canonical for years for being mere repackagers who were not contributing any significant code to the ecosystem.
"An obvious clarification first: Wayland is a protocol definition that defines how a client application should talk to a compositor component. It touches areas like surface creation/destruction, graphics buffer allocation/management, input event handling and a rough prototype for the integration of shell components. However, our evaluation of the protocol definition revealed that the Wayland protocol suffers from multiple problems, including:
The input event handling partly recreates the X semantics and is thus likely to expose similar problems to the ones we described in the introductory section.
The shell integration parts of the protocol are considered privileged from our perspective and we'd rather avoid having any sort of shell behavior defined in the protocol.
However, we still think that Wayland's attempt at standardizing the communication between clients and the display server component is very sensible and useful, but it didn't fit our requirements and we decided to go for the following architecture w.r.t. to protocol-integration:
* A protocol-agnostic inner core that is extremely well-defined, well-tested and portable.
* An outer-shell together with a frontend-firewall that allow us to port our display server to arbitrary graphics stacks and bind it to multiple protocols.
In summary, we have not chosen Wayland/Weston as our basis for delivering a next-generation user experience as it does not fulfill our requirements completely. More to this, with our protocol- and platform-agnostic approach, we can make sure that we reach our goal of a consistent and beautiful user experience across platforms and device form factors. However, Wayland support could be added either by providing a Wayland-specific frontend implementation for our display server or by providing a client-side implementation of libwayland that ultimately talks to Mir."
>This is the worst path Canonical could have possibly chosen. Now developers across all different toolkits and applications, from Gtk+ to Wine, will need to maintain massive patchsets to integrate with Ubuntu. Either that or run in a rootless X window in "legacy" mode.
>This will not end well for interoperability, for developers, or for the wider Linux ecosystem. Bad times.
That comment on OMG! Ubuntu! is making me uncomfortable with the consequences this might have for the GNU/Linux ecosystem. I am not sure about how far I can trust that comment, but the Phoronix article[1] explains it better:
>Canonical developers will work to see that applications relying upon Qt/QML, GTK3, XUL, etc. will be able to use Mir in an "out of the box" manner. The legacy X support will come from an in-session root-less X server.
I would, though, like to know more about the consequences of this. First of all, we know that Unity will be much faster, since this thing needs to run on phones too, and Unity will actually be a "real thing" instead of just a window manager (Compiz) plugin.
>Isn't a point of FOSS that people can all contribute to one major project instead of reinventing the wheel?
That is another comment on OMG! Ubuntu! and I completely agree with it. Sure, freedom of choice is great, but why not use Wayland, really? It was designed from scratch to work for everybody.
I'm personally not too worried here. The thing is both Wayland and Mir will be able to run X on top of them, so currently all available GUI programs will still work.
What matters is the "winner". They will both hit mainstream usage, we will see which one is easier to develop for, and that one will take off. If Mir's claims of fixing input / specialization issues in Wayland comes to fruition, then it will probably win. If Mir hits like Unity, or atrophies like Upstart, then Wayland will probably win.
The problem is: if Wayland fails, everyone can switch to Mir; but if Mir proves weaker, we are stuck with a more fragmented desktop space, because Canonical doesn't change their minds on these things.
This is awesome! I really can't wait. Dealing with the drm/kms stuff to try to build applications "near the metal" on small devices has been painful, painful, painful. Just too many pots, each with its own sous chef. Someone putting some structure around that and getting the GPU folks in line makes so much sense.
I have this question as well. Especially since they have already done so much work around Android for Ubuntu Touch, why not simply build what would essentially be desktop Android with a Unity shell?
I notice that rasterization to non-display devices (e.g. printers) isn't mentioned in the proposal at all. This was a serious weakness of X11 and I'm surprised it's not discussed. In mainstream consumer OSes, such capability is part of the basic graphics toolkits (GDI, Carbon).
If Canonical is serious about attracting mainstream Linux adoption, this is going to have to be addressed from the start.
Remoting over a network with latency is not even an afterthought; none of those words appear in this spec at all. I'm worried that a new display system might start getting traction in the industry while assuming there's only one computer in the world I care about and that I'm sitting in front of it, because that would be a huge step backwards.
On the one hand, this looks like one good way to get rid of the massive bag of hurt that is X; on the other, seriously Canonical? Re-inventing another huge chunk of the stack just because? NIH syndrome much?
X has two drawing APIs. One of these is part of the core X11 protocol; it is ancient, useless, and nobody uses it. The other is the XRender extension, which provides modern composite operations, among other things such as gradients. This is what Cairo, for example, uses. X also has font-drawing APIs.
The X11 protocol (without extensions) defines about 120 types of requests: create a window, move a window, etc.
Nowadays, at least 25% of them are useless: server-side fonts, and the drawing of squares and polygons, are unused by any modern application or toolkit. All of this has been superseded by requests from extensions, like XRender.
The handling of multiple monitors has been totally screwed up, too. X11 was designed to work in Zaphod mode (independent monitors), but Xinerama, and nowadays XRandR, have replaced it: recent X servers (released after ~2007) do not support Zaphod mode anymore, even though it's a core piece of the X11 protocol.
Long story short: the X11 protocol (of which Xorg is an implementation) was created in 1987, and was made for drawing stuff like this: http://xawm.sourceforge.net/Xaw.gif
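Incidentally, you can see which extensions your own running server advertises; for example:
    xdpyinfo | grep RENDER    # prints RENDER if the extension is available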
The important distinction between X and any modern display server is that nowadays we separate compositor (that which composites application content into what you see on the display) and toolkit (that which draws text boxes, buttons, progress bars etc.) into two. X was designed to do both these things. This isn't practical for several reasons. So far people have worked around this issue, but many feel it's time to stop working around it (since the X codebase is a complete mess due to this "working around"), and develop something from scratch that does what we want for a modern desktop in 2013.
It draws process boundaries in the wrong place then forces enormous complexity trying to deal with the resulting synchronization problems. To be fair, when X was designed, I doubt it ever occurred to anyone that one day you'd have opaque resize, much less a fully double-buffered dance between application, X server, and window manager.
Besides being too slow to run on current phones and not natively supporting 3D acceleration, these are the reasons mentioned in the article:
"With respect to shell development (Unity), three major shortcomings of the X stack prevent us from delivering the user experience (f’n’f) we have in mind:
* X shares a lot of system state across process boundaries. This is obviously not a problem in itself but a system-level UI that is meant to provide a beautiful and consistent user experience is likely to require tight control over the overall system state.
* X's input model is complex and allows applications to snoop on input events they do not own. On the one hand, this raises serious security concerns, especially regarding mobile platforms. On the other hand, adjusting and extending X's input model is difficult and supporting features like input event batching and compression, motion event prediction together with associated power-saving strategies or flexible synchronization schemes for aligning input event delivery and rendering operations is (too) complex.
* The compositor hierarchy ends on the session level, and no tight integration into the system from boot time onward is available. For that reason, there is a visible glitch when transitioning the system from a VT-level to the graphical shell level."
Also, it says something when #3 on your big list of complaints about X is that switching graphics modes when starting X is a major problem that needs to be fixed.
It was designed for technologies that are obsolete now, forcing enormous complexity for trivial tasks.
(Disclaimer: I had to work with low level X years ago, so I disdain and hate it with passion).
Today almost everything uses OpenGL acceleration, which provides a standard way of drawing things on screen while being fast and offloading and abstracting work away from the CPU.
OpenGL is in itself a meritocracy of drawing standards: companies innovate via extensions, and what works in the real world is selected into the API, instead of X's design-by-committee, by people who know nothing about drawing.
Just wanted to mention that the news of Mir has hit wayland's mailing list[1]. I'm very curious to see what the wayland developers think of all this.
A lot of work has already gone into wayland and into making things work with it (gtk3, qt5, clutter, etc.). This is truly an ambitious project, and I doubt that Ubuntu engineers would needlessly want to write all of this from scratch if there weren't legitimate shortcomings in wayland's architecture.
Personally, I'm looking forward to my wayland powered, fedora 20 desktop running the yet to be invented WMonad tiling window compositor.
I got the same feeling. They decided to address their decision not to base off of Wayland, but all I read was a convincing reason to fork off of what Wayland is doing. I wish they had given the real reason for their decision; right now it just smells like NIH.
I don't hate this decision. I probably won't upgrade as soon as it comes out, but in my opinion the Linux ecosystem needs competition in the display server segment.
If they screw it up, there are plenty of distros to choose from.
If this is going to have any real hope of replacing X it also needs to be licensed as liberally as X is, otherwise, it's doomed in many commercial sectors. (Current mir license appears to be GPLv3.)
This is Ubuntu's "Bada" move - it is scared of being assimilated by Android that it is forgetting that it can truly innovate in the UX and not by building walled gardens around display managers.
Ubuntu already has significant investment in Surfaceflinger/Android via its Touch vertical. It has also started migrating to QT/QML for its shell (which work really well on Android). There is a significant opportunity to innovate on UX (like Blackberry Z10), rather than throw away the ecosystem that would come with adopting an Android core.
Yesterday, I couldnt join a GoToMeeting using Ubuntu. But just after, I did it in less than 2 minutes by using an app oumy Android device. That is a huge ecosystem, that I want to use on my desktop (I dont know how the desktop UX for a touch app will work, but I hope it will).
I say that Ubuntu, Google and Valve should sit together and come up with a graphics+sound backend that works for all of them. And let me play Half-Life 3 on my phone and desktop simultaneously!
Given my bad experience during the past years of Ubuntu breaking things in Debian and inventing poor software by themselves, my estimation is that they're not smart enough to pull this off.
I agree; they do not have the personnel to put out a major software component like this themselves. Just go to Launchpad and read the bug reports for their forks. Here is a typical nux bug report:
So Canonical decides to go off the beaten path and create nux. Dozens of Ubuntu users report a problem - indicating that at least hundreds of people probably see it.
The resolution? Nothing for 10 months. Then: "Gee, our nux developers have been at work on a cool new version instead of fixing bugs, why not upgrade to that and try it out."
With Ubuntu's upstream libraries from Gnome, freedesktop.org, Debian, etc., you have a confluence of people from different distros who can work on bugs that come up. With these Canonical forks, realistically only Canonical employees are going to fix 95% of the bugs. But they're too busy working on the next cool thing to fix bugs.
My Ubuntu 12.04 LTS desktop currently has hundreds of zombie lightdm processes running on it. I reboot, but they come back. This is not specifically a Canonical-fork problem, but it is an annoying bug, and I and many Ubuntu users have come across many bugs like it. I can imagine what Ubuntu will look like as it forks more and more functionality, and bug fixing can get less and less help from Gnome and fd.o.
mdadm and its associated init scripts were the dealbreaker for me.
They broke the init scripts so array assembly didn't work correctly - not even the way the installed man page stated it would.
They shipped mdadm version 2.x (which has significant flaws) for YEARS after mdadm 3.x was accepted into Debian - YEARS - even after multiple users made clear in bug reports what the problem was.
A user even made a patch and a PPA which fixed things, but they still didn't pick that up for a year plus, as I recall.
And they broke/removed Debian's PERFECT support for installation onto md RAID.
Other stuff I've honestly forgotten, but I swear there was more.
Oh! Oh! Wait, also the STOPPING THE BOOT PROCESS TO WAIT FOR YOU TO PRESS A KEY if a drive needed an fsck!!! What the f....
(don't understand why you were downvoted, good question)
In the past, when Ubuntu was still "Linux for human beings", they did a great job polishing Linux and providing a good Linux experience. They were very good improvers.
But then they decided to become inventors, without a strong engineering background, and their products were trash:
Upstart never provided advanced parallelism and was surpassed by systemd.
The top menu and indicators rely on D-Bus - a really stupid idea and a misuse of the technology.
They abandoned mutter+clutter for a closed GL-canvas rendering library plus Compiz for use in Unity; now mutter+clutter is far more advanced.
And now they want to swap Wayland for Mir? Are they serious? They are not good at inventing things. They will just make Linux fragmentation much worse. It's already hard to make good drivers, and GPU companies can't spend money and people on several different Linux platforms.
Ubuntu went from "Linux for human beings" to "a crappy Mac OS knockoff for the poor".
Also - kind of a noob question, but: my assumption is that all the recent Ubuntu controversy is over Ubuntu desktop. Does this affect the Ubuntu server distro at all? Especially since I am trying to learn how to create a well-provisioned Ubuntu server VM for use with Linode deployments...
While I agree with those of you who say they are worried by this development, I understand and welcome the direction Ubuntu is taking: making not just their own flavor but an almost unique OS. Which is what people said HP should have done.
It is frustrating being an open source advocate when the software you use is somewhat inferior to what is available on other platforms; just try to use Firefox on Ubuntu and compare it to OS X or Windows. If Ubuntu manages to pull this off, and I think they deserve all our support in that, we will get inspiration for all the open source projects, as well as a good codebase they can fork and work with. I am not an expert in licensing, but anything they accomplish can't be bad for the open source movement.
"It is frustrating being an open source advocate when the software you use is somewhat inferior to what is available on other platforms; just try to use Firefox on Ubuntu and compare it to OS X or Windows."
What do you find wrong with Firefox on Ubuntu exactly? I'm not challenging just interested. I use Win7/IE9 and Ubuntu1204/Firefox 19 most days.
Pages load more slowly, and you can notice some actions lagging. As a developer I am used to moving fast through things, so when things are slow, it is noticeable. Maybe it is due to plugins, I don't know; I just know it is surprisingly slower.
Under every other circumstance, I would prefer Linux/Ubuntu to any other machine. I used Linux for several years as my primary dev machine. At the moment I am on a Mac, though I have both Linux and Windows machines as well.
" things they claim wayland/weston input can't be extended to support:
"... adjusting and extending X's input model is difficult and supporting features like input event batching and compression, motion event prediction together with associated power-saving strategies or flexible synchronization schemes for aligning input event delivery and rendering operations is (too) complex."
is already implemented and working in weston today..."
Those who are complaining that Ubuntu is diverging from most other Linux distros forget that this is the nature of open source software and competition.
Part of the success of open source software is that it's highly evolutionary. Good successful projects attract a following and get better. For this to happen, you have to have choice and diversity. There has to be competing flavors, libraries, and distros all vying for market share.
What Ubuntu is doing is great for us all: providing more alternatives in the Linux ecosystem, rather than just relying on X/Wayland/whatever.
Ubuntu seems to have their reasons not to want to use Wayland. Maybe Mir is awesome. Then I can move to it. If it's not so awesome then like Unity and Upstart I can ignore it.
Seems to me the enemy here is the old fashioned clunky X server. Good. Two armies fighting the same target at least eliminates that target before they start fighting each other.
Besides, if they both end up having support in all the major toolkits and they both have an X server fallback, then we should be able to meet in the middle somewhere eventually.
I'm much the opposite: I keep giving Ubuntu a shot, only to find myself disappointed by GNOME, Unity, and the other usual suspects of the DE world. With no need for Unity & Friends, I just use stock Debian or Crunchbang.
I'm rooting for Canonical to do well with this so that I might finally be able to switch to using Ubuntu in order to get a well-supported, well-integrated, targeted-by-most, "Fast and Fluid" Desktop Environment. Up until now I still hide away in a corner with my annoying-to-maintain (but certainly "Fast and Fluid") Openbox environment.
I do agree with you on the well-supported and well-integrated part. Ubuntu has the polish and ease of use of OS X, hardware support nearly as good as Windows', and the solidity of a steadfast GNU/Linux foundation. But whenever I hear about a new development in the Canonical world, it's always about something that appeases consumers rather than the power users Ubuntu used to be targeted at. For developers who constantly need to dive down into the OS, something like Sabayon or Fedora seems a better fit, with their endless modularity. Besides, it would be nice to try a non-Debian distribution for a change.
As a power user, I don't really feel I have particular unfilled needs in any Linux environment. Most of what I need, development-wise, I already have. All that requires is a shell and the ability to configure my apt sources. That said, I don't do much OS-level anything. For that, you will probably always need a distro that assumes less about your graphical environment.
What I do miss from other OSes/environments is that sense of a tighter integration. I spent many years not missing it, having originally gotten into GNU/Linux by cobbling together my own crummy environment in Gentoo (on which Sabayon is based).
It wasn't really until I first tried Windows 7 a few years back that I realized that if I actually wanted to use the latest this-or-that application in a seamless (dare I say enjoyable?) manner I needed to stop clinging to my obscure WMs.
And yet, to this day, I'm actually still clinging.
+1; this was my thought as well. The problem, though, is that I can no longer say to people, "you should try Linux! Ubuntu is predictable, stable, and easy to use."
I don't know if Debian has a similarly friendly desktop distribution, but unless they do, I'll now point people to Linux Mint. Granted... I think Mint has been the better choice ever since Unity came into the picture - not because Unity is bad, but because it's unfamiliar.
"I don't know if Debian has a similarly friendly desktop distribution"
Debian Wheezy (currently testing, but it will be the stable release soonish) provides the Gnome 3.4 desktop, with the options of xfce4, KDE and lxde. A recent install using defaults onto a low-spec netbook was basically easy. See
Thanks; I'm aware of Debian's release cycle, but am not entirely sure how "non-Linux-person-friendly" the stable release is. Which is my own fault, since I've settled for Ubuntu for so long.
It's great to hear it installs well on netbooks! I'm typing this on an Ubuntu Aspire One, though I always use OpenBox to replace whatever Ubuntu comes with since I'm comfortable with it. Will definitely check out Debian stable as an alternative (which is a bit funny coming from Ubuntu... an alternative to an alternative).
I'm a 'non-windows-admin' user and recently had to install Win7 and drivers on a thinkpad T60 for my sister as my usual CentOS/Ubuntu/Debian wasn't what she wanted. Took me a bit of time and some googling.
I think anyone that can reinstall Windows with drivers could manage a Debian install provided they check the wifi card (available firmware) and graphics drivers (not Optimus basically) first.
Gnome 3.4 with Gnome Shell is perhaps a bit heavy for an Atom based netbook although I find it usable.
Debian testing is pretty great. I personally use a mix of testing, unstable, and experimental on my desktop, and a mix of stable and testing on my server.
If it's fresh packages you want, surely you know there's more than just stable. Have you given Sid a try? I've been running it for years on my non-server machines, and I haven't had any more or less problems with it than other bleeding edge distros.
I have run unstable for 10+ years. The only problems I have ever had were graphics-related and required downgrading to an older version of an X11 library. Debian unstable has never put me in a position where I could have lost data.
The other possibility: I just saw an article on Phoronix noting the adoption of the Android window server layer (SurfaceFlinger, or something like that) in Ubuntu mobile. I'm not familiar at all with Android's underpinnings, but would that be another alternative to Mir? I.e., a more modern window server, actively developed by Google dollars, etc.?
SurfaceFlinger is missing a lot of essential desktop display protocol components, like native windowing and chrome allocation; it only supports OpenGL ES, and isn't environment-agnostic at all.
Also, this is completely my opinion, but after reading some of the SF code in the AOSP, it seems like just as much of a mess, implementation-wise, as X is. It is all over the place.
Has no one made the obvious comparison in the comments to IPv4 vs IPv6? The main issue with "yet another graphics protocol" is islanding. Right now all my "Linux" boxes are more or less intercompatible. I can ssh -X into whatever box, run mythtv-setup, and get the config GUI from a server that doesn't even have a monitor (I'm not even sure it has a graphics card).
Now we will inevitably have three little, not-too-compatible islands: X, Wayland, Mir, who knows.
My primary interest as an end user, because the machines "I do stuff on" are multiple huge headless servers and the virtual images running on them, is network transparency. As long as a Mir keyboard/mouse/monitor can connect to a "real" system and give me my X when necessary, in a VNC-like window or whatever if necessary, I'll be OK.
Connectivity demands go both ways. If my refrigerator ends up running Ubuntu for its user interface, I'd really like to be able to remotely connect to it to mess with it.
What comparison to IPv4 vs IPv6? There are very few parallels...
I guess you could say that X and Wayland are like IPv4 and IPv6, where we strive for total migration to the new, better protocol, but in the meantime you can have a dual-stack setup with both that works fine. But it's not like anyone is coming in with a competitor to IPv6, so the analogy completely breaks down...
Relevant: https://lwn.net/Articles/524606/, OpenBSD complaining about modern X becoming less and less portable. Ubuntu, of course, is always willing to turn the incompatibility to 11.
What does this mean for Ubuntu-based distributions that offer a more traditional look and feel, like Linux Mint? I jumped ship for Linux Mint when Ubuntu shoved Unity down everyone's throats.
I like the balance Ubuntu strikes between good hardware support, recent packages, proprietary graphics support, and the "it just works" factor.
I just want all of that without the Unity mess. Unity might be good on a cell phone, but for me it was really crashy and impossible to do any work in; every time I've attempted to use Unity, I couldn't figure out how to do the simplest things with the GUI, and ended up switching away within days.
It sounds like the new window manager will only support Unity.
If it's better, then they have improved the ecosystem, and the developers and users will follow. If it's worse, then we can hope they will switch back, or they will lose users/developers.
The concern is a mixed bag... splitting development effort on something that may or may not ultimately win.
But mixed bags lead to innovation more than having one true way does. Ideas from Mir and Wayland and X can lead to all three of them improving more than if only one or two of them were viable. However, the lack of focus may slow individual development.
TL;DR: The worst case is still not horrible, so long as Canonical recognizes it.
SAR - We need fast driver development using safe and automatic "Template-Based" build systems.
There is a lot GNU/Linux can learn from Embedded Systems development.
Wow. The sheer level of FUD and negativity that you can find in these comments is just mind-blowing. It seems that most of it (with a few knowledgeable exceptions) is coming from folks who don't know what they are talking about and have probably never developed graphical applications for Linux. They can't articulate a single technical reason as to why they dislike whatever it is they dislike, yet they still jump on the bashing theme du jour (which nowadays seems to be Canonical).
Your description sounds a lot like the bashing of X itself.
As someone who is happy with an X desktop I don't entirely get it. I understand that some of the folks doing Wayland have experience hacking on X but it seems a lot like wheel reinvention. X may have some warts but it's hard to argue that it's totally broken; it's also come a long way in the last 10 years or so, I'd say in testament to its ability to be extended. I think innovation in the toolkits (to catch up with where X is today) sounds like a better deal.
Is it going to be compatible with existing window managers for X, or does the change mean that Ubuntu will only work with Gnome or with window managers developed from scratch for Mir?
Anybody know what this will be licensed under? I would think (L)GPL, but https://launchpad.net/mir isn't clear:
"Licences:
GNU GPL v3, GNU LGPL v3, MIT / X / Expat Licence, Other/Open Source
(Boost Software License - Version 1.0)
Commercial subscription expires 2022-09-24
This project’s licence has not been reviewed."
>Application authors relying on Qt/QML, GTK3, XUL etc. should not be required to perform additional porting as we will work on providing Mir integration for the most prominent toolkit choices.
Maybe they've just decided to take Qt first, and that's why there's already a project for this.
I'm a bit confused, I thought GTK+ did its rendering through Cairo, which would mean that a Cairo backend is what's needed. Not 100% sure about this though, I don't track GTK+ at a low level these days.
What I don't get is why Canonical doesn't just implement Unity as a KWin frontend if they are rewriting it anyway. They obviously know Compiz has no support anymore, and if they are switching the bulk of their projects to Qt, it seems like they have a compositor just sitting there waiting to be adopted.
Considering that Plasma (on my install) uses 100MB of RAM to work its wonders (I have no idea what or how), that alone makes me suspect there might be a better way.
The worst thing is using C++ for a GUI engine. Don't make that mistake again!
The code looks quite weak at first glance, though at least not as ugly as average open-source code.
The architectural choices seem a bit weak too - even weaker than a GUI engine I wrote as a junior developer.
Overall, their code reads as if it were written by a junior developer.
It seems too ambitious so far.
The problem is that it is actually really hard to make a proper GUI engine. It requires enormous experience and expertise; you can only come up with something decent after building a few engines of your own and closely analyzing existing solutions.
I tried Mint with Cinnamon a few months ago. I normally use Gnome classic with XMonad as the window manager. Some things that are in Gnome were missing in Cinnamon (bluetooth settings was an example IIRC, but I don't remember the details), and I ended up with a mess of Gnome and Cinnamon, having some apps installed twice (the Cinnamon version and the Gnome version). Other things required some manual work, like integrating Dropbox with Nemo. I ended up going back to Ubuntu.