The complaints about Debian were that they only cut a release like every two years. The Debian response was that people who wanted more timely updates could run the "testing" branch, which was a "rolling" release.
Ubuntu promised to solve the slow release cycles and the instability of a rolling branch by cutting stable and well-composed releases on a rigorous six month schedule.
This announcement sounds a lot like the original Debian model they were trying to get away from.
But even more surprisingly, this list for stable is even longer:
Can anybody explain this?
(Yes, I will take the shame upon myself and ask this on a Debian mailing list; but maybe somebody here has a good explanation...)
1. Read the vulnerability description.
2. Ask yourself if you are vulnerable.
3. No? Don't worry about it.
4. Yes? Apply an external mitigation where possible, or patch it ourselves and commit back to Debian (we have a project member on our team).
We've not got to 4 yet, but have committed loads of fixes anyway.
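Step 2 above (are we vulnerable?) usually boils down to comparing the installed version against the first fixed version from the advisory. A rough sketch of that check in shell - the version strings and the `is_vulnerable` helper are illustrative, and on a real Debian box `dpkg --compare-versions` is the more precise tool:

```shell
#!/bin/sh
# Rough vulnerability check: is the installed version older than the
# first fixed version from the advisory? Uses GNU `sort -V`; note that
# dpkg's epoch/tilde rules differ, so dpkg --compare-versions is more
# accurate on a real Debian system.
is_vulnerable() {
    installed=$1
    fixed=$2
    [ "$installed" != "$fixed" ] &&
        [ "$(printf '%s\n%s\n' "$installed" "$fixed" | sort -V | head -n1)" = "$installed" ]
}

# Illustrative version strings, not from a real advisory:
if is_vulnerable "1.0.1e-2" "1.0.1e-2+deb7u5"; then
    echo "vulnerable: mitigate or upgrade"
else
    echo "not affected"
fi
```

On a live system you would feed the first argument from `dpkg-query -W -f='${Version}' pkgname`.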
1. aptitude update && aptitude upgrade
2. Ask yourself if your system is still vulnerable now.
3. Feel hopeful and go ahead.
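For what it's worth, step 1 can at least tell you whether anything security-related is pending: filter a simulated upgrade for packages coming from the security archive. The sample output below is invented for illustration; on a live system you'd pipe the real `apt-get -s upgrade` instead:

```shell
#!/bin/sh
# apt_sim stands in for `apt-get -s upgrade`; the two lines below are
# invented sample output for illustration only.
apt_sim() {
cat <<'EOF'
Inst libssl1.0.0 [1.0.1e-2] (1.0.1e-2+deb7u5 Debian-Security:7.0/stable [amd64])
Inst vim-common [2:7.3.547-6] (2:7.3.547-7 Debian:7.0/stable [amd64])
EOF
}

# Keep only pending upgrades that come from the security archive:
apt_sim | awk '/^Inst/ && /Security/ {print $2}'
```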
If I had the budget / time to research every single CVE I would probably not trust repositories at all...
do YOU trust repositories?
That's what we do for our internal dev systems.
Ubuntu is probably going to take care of this in their rolling model.
One minor nit: in the section on benefits for Core/MOTU developers it suggests that only 2 releases would need to be supported. I think in reality that would actually be 4 releases: with a 5-year support commitment for each LTS release, there could be up to 3 LTS releases still being supported at any point in time, plus the rolling release itself. Maybe it's just a minor wording thing though - I'm sure the author knows what he's talking about.
I think the expectation is the important thing: if an update breaks something, the response I want to see is 'sorry, we'll fix that ASAP and try to avoid doing it again'. I get the impression that Debian unstable (and to a lesser extent testing) is only intended for people who're willing to put up with things breaking often.
As I see it, software has to change. The question for distros is just when? A rolling distro sees such changes continuously, and each package may in principle change at any point independently of any other package. Doesn't that mean that at any point, my workflow-critical programs may change their behavior or even stop working together? Sure, I can do upgrades more seldomly, but then I'm left without security updates. The main benefit I feel that non-rolling distros offer is predefined breakage points. I know that Ubuntu Quantal will work the way it works now until I upgrade to the next release. Even simply knowing in what way software is (and will remain) broken is useful.
Now of course, one can stay with LTS and get the same behavior. I guess I'm simply complaining because the 6 month cycle was perfect for me.
At any rate, I would guess that most users only really desire rolling behavior for a few (dozens?) of packages. Ubuntu has that nicely covered with PPAs (and also "special status" rolling packages like Firefox).
The way it's set up now, there is a -proposed pocket where major transitions are held, so the old notion of a library upgrade breaking $foo applications doesn't really happen anymore. From my view, I've been running 13.04 every day, and about halfway through the development cycle it was better than 12.10. EDIT: There is also a ton of active QA happening in Ubuntu now; we have nice things we never had before, like daily builds firing off and having tests run, prerelease pocket testing, and (soonish) phased updates.
As far as what you're going to do when your user apps upgrade, you've been doing it on Firefox/Chrome/your phone for ages now, it's about time our desktops caught up!
As an Arch Linux user, I've had no problems with rolling releases. I know that's not much to go on, but I think with a large known company running the show, there is going to be some responsibility here.
Hopefully others can chime in.
If you've got workflow-critical programs you don't want updating, use the LTS release. That's what it's for.
How would the move to PulseAudio have worked in a rolling release? Would someone with a new install end up with PulseAudio, but someone with an existing install, would just never have it installed unless they knew what it was and started poking around for it?
The same for the transition to using evdev for (e.g.) mouse-handling in X.org rather than defining things in xorg.conf.
I know that Debian has lived through these transitions, but my impression was that the packages were set up in a way that allowed people to keep the old config or move to the new one. That said, these issues could be more complex than a 'human being' (Ubuntu's target market) would be able to decipher.
That means the same sort of transition of config files and libraries that occur on release upgrade would just happen during a normal package install, which is pretty much how it works anyway (a release upgrade isn't much more than a giant group of packages being upgraded together -- all this logic is stored in the packages themselves).
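As a rough illustration of what that packaged logic looks like, here is a heavily simplified postinst-style config migration; the paths and behavior are invented, not taken from any actual package:

```shell
#!/bin/sh
# Heavily simplified sketch of the config migration a package's
# postinst script might perform on upgrade. Paths are illustrative.
set -e

OLD_CONF=${OLD_CONF:-/etc/example/old.conf}
NEW_CONF=${NEW_CONF:-/etc/example/new.conf}

migrate_conf() {
    # Migrate only if the old config exists and the new one doesn't,
    # so an admin's hand-written new config is never clobbered.
    if [ -f "$OLD_CONF" ] && [ ! -f "$NEW_CONF" ]; then
        cp "$OLD_CONF" "$NEW_CONF"          # real scripts often rewrite syntax here
        mv "$OLD_CONF" "$OLD_CONF.migrated" # keep the old file as a backup
    fi
}

migrate_conf
```

Whether this runs at release-upgrade time or on an ordinary rolling update, the logic is the same; that's the point.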
For most normal upgrades this doesn't matter because the applications are usually fairly insular, but when you get into integrated systems like Desktop Environments it's different. Especially when Ubuntu is making a number of UI/etc changes to the desktop. It would be a bit surprising to have UI parts shift around. When you do a release upgrade, you know that there are going to be some major changes, and there are ChangeLog/notes on the release itself to tell you what major changes there are.
I know. And it could be that I'm just grumpy that I'm losing the great thing Ubuntu gave me: essentially 6 months of frozen Debian Unstable.
Now we might get 18 months of frozen Unstable along with a Testing/Unstable-like rolling thing. Except for the regular-guy-friendly installer, what will Ubuntu really add over Debian anymore then?
What's kept me on Ubuntu for 6 years was that it gave me exactly what I wanted: relatively frequent, but frozen snapshots of Sid! But then again, if I'm in a small minority thinking like this, I won't complain :-)
Let's point out the obvious: the user interaction of a small handheld device is very different from that of a computer with a full-size monitor and mouse. Thus any OSes on those two devices have to have different UI features. That's it.
Of course the parts of the OS that are not UI can be converged. In that sense Linux is already a fully converged OS.
But you are never going to fully converge the UI of two devices that use vastly different UI methods.
Citation needed. All I see are loud complaints of a minority of mostly non-users.
Real users either use Ubuntu with Unity and are happy with it, or use Ubuntu with an alternative desktop (eg. Xubuntu, Kubuntu, Lubuntu) and are happy with it.
And what happens in-between? What happens in the case of tablets? Notebooks with a touchscreen? Future devices like large format displays with full UI, and future paradigms like gesture control?
Notebooks with a touchscreen can be treated either as tablets with a bunch of integrated USB/Bluetooth peripherals, if you want to use them like MS Surface, or alternatively, a regular notebook with an oversized multitouch-enabled trackpad that happens to be the display. So there's no need for a new UI paradigm there.
Super-large or gesture-controlled devices will probably need their own UI paradigms when they become mainstream.
None of this is meant to imply that every device will need its own, incompatible, radically different UI. Competent designers should be able to come up with a "line-up" of 3-4 UIs that "gracefully degrade" into one another, e.g. a menu bar appears if the device has a mouse, and non-maximized apps are allowed if the screen is larger than X inches. And of course, lots of options for power users, e.g. I want non-maxed apps in my 7-inch device even though the default only triggers them in 10-inch devices, and I want Alt+Tab to switch between windows rather than apps.
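That "graceful degradation" idea is basically capability-based selection. A toy sketch, where the thresholds, mode names, and inputs are all made up for illustration:

```shell
#!/bin/sh
# Toy capability-based UI selection. Thresholds and mode names are
# invented for illustration; a real design would also honor user
# overrides (e.g. forcing non-maximized apps on a 7-inch screen).
pick_ui() {
    screen_inches=$1
    has_mouse=$2
    if [ "$has_mouse" = "yes" ] && [ "$screen_inches" -ge 10 ]; then
        echo "desktop"   # menu bar, non-maximized windows
    elif [ "$screen_inches" -ge 10 ]; then
        echo "tablet"    # maximized apps, touch-sized targets
    else
        echo "phone"     # single-app view
    fi
}

pick_ui 24 yes   # desktop
pick_ui 10 no    # tablet
pick_ui 4 no     # phone
```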
Within the proposal, it states that packages would be released when ready, and when stable. This doesn't mean that you are going to live on the bleeding edge like an Arch or Gentoo user may. You're going to get the updates after Ubuntu has done QA on them, and the maintainer will probably be peeking at bug trackers on other distros for tips on issues (Arch and Gentoo effectively expand their QA process).
It is true that there will probably be periodic UI changes in some of the applications. It is not necessarily true that these changes will happen any more frequently than they do now (since each application has different life cycles). What will happen is you will potentially not need to wait 6+ months to get a new version of a package with a fix or improvement that is important to you.
If it lets them spend more time on QA and development rather than arbitrary deadlines and bundling, this sounds like a win-win to me. If and when problems happen, they'll get it figured out and hopefully not repeat their mistakes.
1. Run pacman -Syu
2. Pacman failed.
3. Go to Arch website
4. Learn that I shouldn't have run pacman just then, and now I have to follow these steps to un-break my system
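Since most of these breakages are announced on the Arch news page first, one defensive habit is to refuse to sync until you've acknowledged the news. A sketch of that idea; the marker-file convention here is my own invention, not an Arch tool:

```shell
#!/bin/sh
# Sketch of a "read the news first" guard around pacman -Syu.
# The NEWS_READ / LAST_SYNC marker files are my own convention (touch
# NEWS_READ after reading the Arch news page), not anything shipped
# with pacman.
safe_update() {
    if [ ! -f "$NEWS_READ" ] || { [ -f "$LAST_SYNC" ] && [ "$LAST_SYNC" -nt "$NEWS_READ" ]; }; then
        echo "refusing: read the Arch news page, then: touch $NEWS_READ"
        return 1
    fi
    echo "would run: pacman -Syu"   # placeholder for the real command
    touch "$LAST_SYNC"
}

# Demo with temporary marker files:
NEWS_READ=$(mktemp)      # pretend we just read the news
LAST_SYNC=$(mktemp -u)   # no sync recorded yet
safe_update
```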
The most notable instance was when Arch decided that /lib should be a symlink to /usr/lib. Yeah, I barely recovered from that... it also reinforced my position that basic utilities should be statically linked.
I finally gave up on Arch when a GRUB update broke my system and locked me out of my encrypted root.
Despite the problems, I still find rolling release vastly preferable. If Ubuntu can bring some of the commercial support quality to a rolling release, they'd alleviate a lot of the difficulties that I have with Arch.
That's the number one thing to remember with pacman: _never_ force an update. That is how 90% of problems arise.
I got burnt by the glibc mess last summer, where you had to run a specific set of --force things in a given order to get through and if you didn't, well, you lost your /lib (at which point it's game over, since freaking pacman isn't statically linked).
The response in the forums or IRC (I don't remember which) was: "It's a rolling release, you had three whole months to update, too bad for you".
Atomic releases avoid the problem because everyone makes these kinds of breaking changes at the same point, whereas with rolling releases you can't force people to update at the same time, so you'll always end up losing people once your migration path inevitably becomes invalid as everything moves forward.
Because Ubuntu serves such a different target audience, I expect the whole update process will be more hands-free and possibly a little more smooth. Their mission is "Linux for Human Beings" instead of "Linux for hackers and enthusiasts", so this is a necessity.
I've been using my current Arch install as my main desktop since the great systemd switch late last year, and I just have it do an apper autoupdate on Sundays. I don't even do pacman -Syu, I just check the logs after an update before a restart to make sure it went smoothly.
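For reference, that kind of weekly hands-off update is just a one-line cron job; the schedule, log path, and use of pacman here are arbitrary examples:

```
# /etc/cron.d/weekly-update -- illustrative name, schedule, and paths.
# Every Sunday at 04:00, sync packages and log the result; review the
# log before the next reboot.
0 4 * * 0  root  pacman -Syu --noconfirm >> /var/log/weekly-update.log 2>&1
```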
It kind of won me over, that and yaourt at least. I have Ubuntu on my grandparents' machine, and SSHing into it and using apt starts to feel last-century after getting used to pacman.
1. They could underestimate the effort needed. Continuously integrating the packages from sid, upstream, and their own development effort, all without breaking things for the user and without dedicated maintainers per package, is quite an undertaking.
The mechanisms Gentoo has in place to manage this don't seem fitting for Ubuntu.
2. The half-year release cycle gives users a sense of the progress being made. Even while using an LTS release, by observing the new releases I can see where Ubuntu is heading, what is changing, and what to expect in the next LTS. The releases give Ubuntu an opportunity to talk about its changes.
I'm not saying it can't work or that it's a bad idea (I don't have enough experience maintaining software, let alone a distro, to judge). I see the positive aspects, and the proposal sounds reasonable.
Something to keep in mind: just because there is an updated package available for something, there is no need to push it out on day 1. I suspect they've got a good feel for update frequency and priority on various packages. Worst case, they'll probably still get it updated sooner than the six month cycle would have (or 12 months if you run LTS).
And if there's a major change and some users decide to wait before upgrading (while the bugs get ironed out) it's easier to support them if they all wait at the same version.
The current difference in manpower and mind share is pretty big between these two related distros. Regardless of which one each of us may like better, Canonical has the manpower and mindshare to probably have a bit more success with this model. I really like Debian, so this is not a knock on them (I used Debian on servers for a very long time with great results).
Heck, we may even see Debian and Ubuntu working more closely once they take on this shared dev style (I know, I am pipedreaming, but hey...).
Whether you like them or not, Canonical is probably going to be able to do this a bit better. They really can't afford to not do it any better. The mindshare difference is particularly staggering right now. This isn't a direct measure of mind share, but take this as an example:
Do you realize that Ubuntu uses Debian as a base and most of the effort is actually done by Debian developers?
I've had pretty bad experiences with non-LTS versions of Xubuntu. The last time I ran it on my laptop, things broke unpredictably, a little at a time. When I say unpredictably, I mean totally by surprise, not even after performing an update. First, PulseAudio stopped working. I spent a few days trying to fix it, then gave up and reverted to ALSA. Then the graphical login got stuck in an endless loop, so I reverted to a text login, starting X manually after logging in. When wireless stopped working, I gave up. I run the LTS Xubuntu on two other machines and neither of them has had problems like this. I put my laptop on the LTS Xubuntu and it's been good since.
I'm guessing mainstream Ubuntu gets more QA and is more stable than Xubuntu. It might be harder for derivative distros to keep up with a rolling release. Or maybe not... it might give them more flexibility to QA things as needed instead of scrambling to keep up with a new release every six months. As long as Canonical keeps the LTS cycle going, moving to a rolling release sounds reasonable.
On my older laptop things were breaking on each upgrade. Most problems I had were with the Geforce graphics card included, but wireless stopped working at some point in combination with the router that I have at home (kept working with other routers just fine, so I guess it was a protocol issue). Anyway, it required constant tinkering.
So first of all, on that laptop I gave up on Nvidia's proprietary drivers and went with Nouveau, which for my needs works well. 90% of all problems I ever had were linked to these proprietary Nvidia drivers. Don't know how well Nouveau works with newer Geforce cards, but you should give it a try.
I also have a new Thinkpad with Intel HD 4000 graphics and the Wifi chip is also from Intel. I rejected other models with Geforce cards on purpose. With the latest Ubuntu everything worked flawlessly out of the box on this laptop, graphics, wifi, sound, everything.
People expect Linux distros to run well on any hardware they throw at it and many times it does, but if you're buying a new laptop / PC, do some research first and prefer models for which all drivers are available as open-source, or if you don't want that hassle, search for models certified for Linux. Thinkpads are good (if you get it with Intel HD at least), Dell also sells a couple of models that are Ubuntu-certified, like that new developer-targeted laptop, there's also System76 and I'm sure you can find others.
They are basically adopting the Debian scheme, only with more frequent stable releases.
I worry that eventually its mobile plans will bury the desktop Ubuntu, move it out of focus, so to say. When that happens, I wonder whither I should turn. Debian?
I tracked stable, so maybe testing would be better in terms of how outdated packages get, but hardware compatibility is Debian's Achilles heel right now.
I guess what I'm really asking is what are the possible negative implications of the plan for the more "traditional" Ubuntu users? Many of whom are likely sysadmins trying to maintain consistent package versions across their infrastructure. Sticking to the LTS versions is the first step. But then you would be delaying access to newer package versions without the interim releases.
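For that crowd, APT pinning is the usual answer: pin everything to the LTS archive and selectively raise the priority of the few packages you want newer. The release names, package, and PPA origin below are placeholders, not recommendations:

```
# /etc/apt/preferences.d/pin-lts -- illustrative; the release name and
# the pinned package/PPA origin are placeholders, not recommendations.
Package: *
Pin: release n=precise
Pin-Priority: 900

Package: postgresql-9.2
Pin: release o=LP-PPA-example
Pin-Priority: 990
```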
This clearly means more work for the maintainers, but only because the PPA method is so clearly a hack that there is no expectation that it must just work.
Debian stable + backports is probably the closest. Unfortunately, there are few backports: partly because few people use them, and partly because upstream rarely cares about Debian stable.
Obviously, this is less of a problem with the more popular PPAs (Postgres, for example).
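For completeness, enabling backports is two lines of config plus an explicit `-t` flag per install; the release name and mirror here are just examples:

```
# /etc/apt/sources.list.d/backports.list -- release and mirror are examples
deb http://http.debian.net/debian wheezy-backports main

# Backports are never installed by default; you opt in per package:
#   apt-get -t wheezy-backports install some-package
```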
I view it differently: Currently, the situation doesn't unexpectedly change for six months :-)
Of course, in my experience rolling releases have never worked in anything close to production environments - it's what killed FreeBSD in the data centre. But if you're a techie with some time on your hands, it could be cool.
Mandriva had a rolling release with Cooker, and Mageia has one with Cauldron. Sorry if I look like a troll, but I never got the hype around this Ubuntu distro; I wish HN supported Mageia more, since DistroWatch even reports Mageia as the number two most popular distro.