Ubuntu Rolling Release Proposal (ubuntu.com)
159 points by hotice 1607 days ago | 89 comments



What's interesting is that they are in part re-creating the original problem they were trying to solve.

The complaints about Debian were that they only cut a release like every two years. The Debian response was that people who wanted more timely updates could run the "testing" branch, which was a "rolling" release.

Ubuntu promised to solve the slow release cycles and the instability of a rolling branch by cutting stable and well-composed releases on a rigorous six month schedule.

This announcement sounds a lot like the original Debian model they were trying to get away from.


A difference is that Debian's testing branch was a testing branch, and as such did not have the same effort and priority put into keeping it stable. Even as a rolling release, I would expect Ubuntu's main repositories to be more stable than Debian testing.


From extensive experience, Debian testing is way more stable than even Ubuntu's current LTS release.


I would like to learn from others: how do you handle security with testing? I like Debian testing very much, and occasionally I have the idea of using it in production too, but I fear having to take care of many security fixes on my own. Here are the high-urgency vulnerable source packages for testing:

https://security-tracker.debian.org/tracker/status/release/t...

but more surprisingly, the corresponding list for stable is even longer:

https://security-tracker.debian.org/tracker/status/release/s...

Can anybody explain this?

(Yes, I will take the shame upon me and ask this on a Debian mailing list; however, maybe somebody has a good explanation here...)


As follows:

1. Read the vulnerability description.

2. Ask yourself if you are vulnerable.

3. No? Don't worry about it.

4. Yes? External mitigation where possible, or patch it ourselves and commit back to Debian (we have a project member on our team).

We've not got to 4 yet, but have committed loads of fixes anyway.
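
For step 2, Debian's debsecan tool can take some of the grind out of checking: it lists open CVEs against the packages actually installed on the host, rather than the whole archive. A sketch (the suite name is just an example; use whatever you track):

  apt-get install debsecan
  # list only vulnerabilities that already have a fix available
  debsecan --suite wheezy --only-fixed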


OK, but the real-life (low-budget) scenario is:

1. aptitude update && aptitude upgrade

2. Ask yourself if your system is vulnerable now.

3. Feel the good hope and go ahead.

If I had the budget / time to research every single CVE I would probably not trust repositories at all...

do YOU trust repositories?


Yeah and that's pretty much good enough for most people.

That's what we do for our internal dev systems.


Yeah, security fixes are the one thing that prevents using testing for real world stuff.

Ubuntu is probably going to take care of this in their rolling model.


Labels and your expectations aside, testing usually is pretty stable, and non-LTS Ubuntu releases are sometimes not. At least with testing you can hope for fixes in less than six months; Ubuntu generally doesn't fix bugs in non-LTS releases until the next release.


I think that's a very well reasoned proposal and I'd be very happy to see it implemented. I might be unusual, but I find myself in both of the user groups that the proposal identifies. For my 'utility' machines (home server, XBMC host, web servers) I prefer the stability of the LTS series, but for my laptop I'm happy to trade that stability for getting new stuff sooner, particularly new usability features that are getting baked into Unity and other components.

One minor nit: in the section on benefits for Core/MOTU developers it suggests that only 2 releases would need to be supported. I think in reality that would actually be 4 releases: with a 5-year support commitment for each LTS release, there could be up to 3 LTS releases still being supported at any point in time. Maybe just a minor wording thing though - I'm sure the author knows what he's talking about.


Your use case is exactly what many of us Debian users have been doing for a long time. I use Unstable on my laptop and Stable (+ backports) on my VPS for those exact reasons.


Technically, Debian unstable is close to what I want, but I don't want to be using something that comes with 'at your own risk' warnings day to day.

I think the expectation is the important thing: if an update breaks something, the response I want to see is 'sorry, we'll fix that ASAP and try to avoid doing it again'. I get the impression that Debian unstable (and to a lesser extent testing) is only intended for people who're willing to put up with things breaking often.


Well, yes, it comes with absolutely no guarantees. That said, many Debian developers use it, and breakages have been rare at least for the last few years, since I started using it.


I'm sure the decision wasn't taken lightly, and I'm sure Canonical has concluded that an 18 month LTS + rolling makes the most sense for the project. As a user, I am a bit worried about rolling releases though. Maybe someone here can alleviate my worries?

As I see it, software has to change. The question for distros is just when? A rolling distro sees such changes continuously, and each package may in principle change at any point, independently of any other package. Doesn't that mean that at any point my workflow-critical programs may change their behavior or even stop working together? Sure, I can upgrade less often, but then I'm left without security updates. The main benefit I feel non-rolling distros offer is predefined breakage points. I know that Ubuntu Quantal will work the way it works now until I upgrade to the next release. Even simply knowing in what way software is (and will remain) broken is useful.

Now of course, one can stay with LTS and get the same behavior. I guess I'm simply complaining because the 6 month cycle was perfect for me.

At any rate, I would guess that most users only really desire rolling behavior for a few (dozens?) of packages. Ubuntu has that nicely covered with PPAs (and also "special status" rolling packages like Firefox).


> The question for distros is just when? A rolling distro sees such changes continuously, and each package may in principle change at any point independently of any other package.

The way it's set up now, there is a -proposed pocket where major transitions are held, so the old notion of a library upgrade breaking $foo applications doesn't really happen anymore. From my view, I've been running 13.04 every day, and about halfway through the development cycle it was better than 12.10. EDIT: There is also a ton of active QA happening in Ubuntu now; we have nice things we never had before, like daily builds firing off and having tests run, prerelease pocket tests, and (soonish) phased updates.

As far as what you're going to do when your user apps upgrade, you've been doing it on Firefox/Chrome/your phone for ages now, it's about time our desktops caught up!


>>> As a user, I am a bit worried about rolling releases though. Maybe someone here can alleviate my worries?

As an Arch Linux user, I've had no problems with rolling releases. I know that's not much to go on, but I think with a large known company running the show, there is going to be some responsibility here.

Hopefully others can chime in.


The decision hasn't been made yet. Canonical's CTO may have decided he likes it, but the Ubuntu project hasn't yet given it a pass.

If you've got workflow-critical programs you don't want updating, use the LTS release. That's what it's for.


I think the real issue is when infrastructure things move.

How would the move to PulseAudio have worked in a rolling release? Would someone with a new install end up with PulseAudio, but someone with an existing install, would just never have it installed unless they knew what it was and started poking around for it?

The same for the transition to using evdev for (e.g.) mouse-handling in X.org rather than defining things in xorg.conf.

I know that Debian has lived through these transitions, but my impression was that the packages were set up in a way that allowed people to keep the old config or move to the new one. That said, it seems like these issues could be more complex than a 'human being' (Ubuntu's target market) would be able to decipher.


It's not even clear that the rolling release will be intended for non-enthusiasts at this point. However, as a matter of policy you can be pretty sure that any update applied to the rolling release will need to be tested against installs as old as the last LTS.

That means the same sort of transition of config files and libraries that occur on release upgrade would just happen during a normal package install, which is pretty much how it works anyway (a release upgrade isn't much more than a giant group of packages being upgraded together -- all this logic is stored in the packages themselves).


I get the difference between a release upgrade and a rolling release schedule. My point was that when you are creating a release, you can say "this is the point where this change happens." With a rolling release, will an "apt-get upgrade" transition me? What if this was unexpected? What if this breaks my config?

For most normal upgrades this doesn't matter because the applications are usually fairly insular, but when you get into integrated systems like Desktop Environments it's different. Especially when Ubuntu is making a number of UI/etc changes to the desktop. It would be a bit surprising to have UI parts shift around. When you do a release upgrade, you know that there are going to be some major changes, and there are ChangeLog/notes on the release itself to tell you what major changes there are.


> If you've got workflow-critical programs you don't want updating, use the LTS release. That's what it's for.

I know. And it could be that I'm just grumpy that I'm losing the great thing Ubuntu gave me: essentially 6 months of frozen Debian Unstable.

Now we might get 18 months of frozen Unstable along with a Testing/Unstable-like rolling thing. Except for the regular-guy-friendly installer, what will Ubuntu really add over Debian anymore then?

What's kept me on Ubuntu for 6 years was that it gave me exactly what I wanted: relatively frequent, but frozen snapshots of Sid! But then again, if I'm in a small minority thinking like this, I won't complain :-)


This repeated notion of a truly converged OS is really annoying and it is very disappointing to see Ubuntu continue to pursue it despite the loud complaints of most of their users.

Let's point out the obvious: the user interaction of a small handheld device is very different from that of a computer with a full-size monitor and mouse. Thus any OSes on those two devices have to have different UI features. That's it.

Of course the parts of the OS that are not UI can be converged. In that sense Linux is already a fully converged OS.

But you are never going to fully converge the UI of two devices that use vastly different UI methods.


> despite the loud complaints of most of their users.

Citation needed. All I see are loud complaints of a minority of mostly non-users.

Real users either use Ubuntu with Unity and are happy with it, or use Ubuntu with an alternative desktop (eg. Xubuntu, Kubuntu, Lubuntu) and are happy with it.


>Let's point out the obvious: the user interaction of a small handheld device is very different from that of a computer with a full-size monitor and mouse.

And what happens in-between? What happens in the case of tablets? Notebooks with a touchscreen? Future devices like large format displays with full UI, and future paradigms like gesture control?


At this stage the key differentiation point is touch (i.e. your finger as the pointer) vs. the pointer (mouse/keyboard, etc.). The UI for these two categories is naturally different, and convergence would be possible only when the UI can handle both. Even in the case of notebooks with touchscreens, the UI has to handle both requirements well, that of the finger and that of the mouse/keyboard. I personally feel Ubuntu is neglecting the latter. In time, that may well be the right choice. At present, it's definitely only part of the story.


I think the answer re tablets is the meta-answer for Linux generally: copy somebody else who has solved the problem. The Apple iOS/OS X dichotomy comes to mind as something to ape.


You just make another couple of custom UIs for "in-between" cases.

Notebooks with a touchscreen can be treated either as tablets with a bunch of integrated USB/Bluetooth peripherals, if you want to use them like MS Surface, or alternatively, a regular notebook with an oversized multitouch-enabled trackpad that happens to be the display. So there's no need for a new UI paradigm there.

Super-large or gesture-controlled devices will probably need their own UI paradigms when they become mainstream.

None of this is meant to imply that every device will need its own, incompatible, radically different UI. Competent designers should be able to come up with a "line-up" of 3-4 UIs that "gracefully degrade" into one another, e.g. a menu bar appears if the device has a mouse, and non-maximized apps are allowed if the screen is larger than X inches. And of course, lots of options for power users, e.g. I want non-maxed apps in my 7-inch device even though the default only triggers them in 10-inch devices, and I want Alt+Tab to switch between windows rather than apps.


That would be why they created, or are in the process of creating, four entirely different UIs for different devices: Ubuntu Desktop, Ubuntu Phone, Ubuntu Tablet, and Ubuntu TV (if that's what it's called). Have you actually been following what you're talking about? There are different UIs, and they are only converging the underlying codebase. What exactly are you talking about?


I'll miss the twice-yearly fun of trying out a new release, and the name alliteration, but I think overall it's a smart move for Ubuntu. Does anyone have any reasons why they shouldn't move toward this? I'm curious.


A lot of the counterargument is based on FUD and bad experiences with OTHER rolling-release distros. I'm not sure we can look at Gentoo or Arch as examples of what Ubuntu would be aiming to do (and I say this as an Arch user).

Within the proposal, it states that packages would be released when ready, and when stable. This doesn't mean that you are going to live on the bleeding edge like an Arch or Gentoo user may. You're going to get the updates after Ubuntu has done QA on them, and the maintainer will probably be peeking at bug trackers on other distros for tips on issues (Arch and Gentoo users effectively become an extension of Ubuntu's QA process).

It is true that there will probably be periodic UI changes in some of the applications. It is not necessarily true that these changes will happen any more frequently than they do now (since each application has different life cycles). What will happen is you will potentially not need to wait 6+ months to get a new version of a package with a fix or improvement that is important to you.

If it lets them spend more time on QA and development rather than arbitrary deadlines and bundling, this sounds like a win-win to me. If and when problems happen, they'll get it figured out and hopefully not repeat their mistakes.


I don't think Arch is as bleeding edge as most people think, and packages are usually upgraded in such a way that dependencies don't cause problems in other packages (at least in my experience). The key has always been to always do a full system upgrade and not upgrade individual packages, i.e. always run pacman -Syu or pacman -Syu <packagename>. In four years I've never seen anything break (knock on wood), aside from a minor issue with a pcre upgrade some time ago. That has just been my experience though; I run Arch on my desktop, laptop and about 10 VPS servers.
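
For anyone unfamiliar with pacman, the distinction looks like this (packagename is a placeholder):

  # safe: refresh the package databases and upgrade the whole system
  pacman -Syu
  # also safe: install one package as part of a full system upgrade
  pacman -Syu packagename
  # risky: refreshes the databases but installs against an otherwise
  # stale system -- a "partial upgrade", which Arch does not support
  pacman -Sy packagename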


My experience with Arch has frequently been:

1. Run pacman -Syu

2. pacman fails.

3. Go to the Arch website.

4. Learn that I shouldn't have run pacman just then, and now I have to follow these steps to un-break my system.

The most notable instance was when Arch decided that /lib should be a symlink to /usr/lib. Yeah, barely recovered from that... it was also a reinforcement of my position that basic utilities should be statically linked.

I finally gave up on Arch when a GRUB update broke my system and locked me out of my encrypted root.


Yeah, I got caught by forcing the /usr/lib update too. Went away for a month, came back and did a full upgrade: failed. Checked the website, the news had dropped off the front page, forced the update, broke it horribly.

Despite the problems, I still find rolling release vastly preferable. If Ubuntu can bring some of the commercial support quality to a rolling release, they'd alleviate a lot of the difficulties that I have with Arch.


> forcing the /usr/lib update

That's the number one thing to remember with pacman, _never_ force an update. That is how 90% [citation needed] of problems arise.


Arch is very good if you run it as your main system and keep it updated very frequently. Forget it for six months (because it's an unused multiboot, or whatever) and try to do an upgrade, and it might be more painful.

I got burnt by the glibc mess last summer, where you had to run a specific set of --force things in a given order to get through and if you didn't, well, you lost your /lib (at which point it's game over, since freaking pacman isn't statically linked).

Response in forums or irc (I don't remember) was "It's rolling release, you had three whole months to update, too bad for you".

Atomic releases avoid the problem because everyone makes this kind of breaking change at the same point, whereas with rolling releases you can't force people to update all at the same time, so you'll always end up losing people once your migration path inevitably becomes invalid as everything moves forward.


Couldn't you solve that problem with some type of repository snapshotting/versioning? So if you try to update after more than, say, a month, the system installs the updates incrementally.
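
Debian's snapshot archive already works roughly this way: it serves dated states of the whole repository that sources.list can point at directly (the timestamp below is only an example):

  deb http://snapshot.debian.org/archive/debian/20130301T000000Z/ sid main

Stepping through a sequence of such snapshots would be one way to apply a backlog of updates incrementally.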


That is precisely what the Ubuntu proposal describes.


Arch has been overall very good for me, but there have been breakages (and I experienced a whole lot more with Gentoo, though that was a while ago). Arch's systemd migration in particular was somewhat painful and the documentation wasn't great for the process (it has improved somewhat since). I seriously doubt that Ubuntu would expect their users to do what I had to do to apply that update (since Arch and Ubuntu serve very different target audiences). This is no knock on Arch, their approach is appropriate for their audience.

Because Ubuntu serves such a different target audience, I expect the whole update process will be more hands-free and possibly a little more smooth. Their mission is "Linux for Human Beings" instead of "Linux for hackers and enthusiasts", so this is a necessity.


> I expect the whole update process will be more hands-free and possibly a little more smooth.

I've been using my current Arch install as my main desktop since the great systemd switch late last year, and I just have it do an Apper auto-update on Sundays. I don't even run pacman -Syu; I just check the logs after an update, before a restart, to make sure it went smoothly.

It kind of won me over, that and yaourt at least. I have Ubuntu on my grandparents' machine, and sshing into it and using apt starts to feel last-century after getting used to pacman.


There are two points against this:

1. They could underestimate the effort needed. Continuously integrating the packages from sid, upstream, and their own development effort, all without breaking things for the user and without having targeted maintainers per package, is quite an undertaking.

The mechanisms Gentoo has in place to manage this don't seem fitting for Ubuntu.

2. The half-year release cycle gives users a sense of the progress made. Even while using an LTS release, by observing the new releases I can see where Ubuntu is heading, what is changing, and what to expect from the next LTS. The releases give Ubuntu an opportunity to talk about their changes.

Not saying it can't work or it is a bad idea (I don't have enough experience maintaining, especially not a distro, to judge). I see the positive aspects and the proposal sounds reasonable.


The packaging is actually likely to be a lot easier, which is one of the big perks of this model. They only get stuck supporting the LTS releases, instead of having to backport fixes to every release in the last X years (was it 4?).

Something to keep in mind is that just because there is an updated package available for something, there is no need to get it updated on day 1. I suspect they've got a good feel for update frequency and priority on various packages. Worst case, they'll probably still get it updated sooner than the six-month cycle would (or the 12-month cycle, if you run LTS).


If major changes (like dumping Gnome for Unity) get triggered by the same button as security patches etc users might be reluctant to press that button.

And if there's a major change and some users decide to wait before upgrading (while the bugs get ironed out) it's easier to support them if they all wait at the same version.


I definitely support this. Having a rolling release was the primary reason why I decided to go with Debian (unstable) vs (x)ubuntu a few weeks ago. My previous install (oneiric; non-LTS) was about to become unsupported, and the last time I decided to ignore the official support status (years ago... I think it was dapper?), the main repository up and vanished one day. That was not a fun morning.


The problem is that Debian can be faster or slower whilst Ubuntu LTS releases are on a schedule. For example I believe the currently stable Debian release includes Python 2.6 because 2.7 just missed it. I prefer Ubuntu's LTS releases to Debian because they are predictable.


Yes, I agree that Ubuntu's time-based releases are generally a better experience than Debian's "whenever it's ready" stable releases. Debian's testing and unstable branches are rolling releases though, and those were the only alternatives I was actually considering.


Although, I'm not sure we can even look at Debian when trying to predict what a rolling-release Ubuntu would look like (even considering that they are "relatives").

The current difference in manpower and mind share is pretty big between these two related distros. Regardless of which one each of us may like better, Canonical has the manpower and mindshare to probably have a bit more success with this model. I really like Debian, so this is not a knock on them (I used Debian on servers for a very long time with great results).

Heck, we may even see Debian and Ubuntu working more closely once they take on this shared dev style (I know, I am pipedreaming, but hey...).


I was under the impression that the Debian team is much larger than Canonical's. This is why they have always said that they could not do what they do without Debian. The biggest problem with a rolling release is that you constantly have to reboot the server to apply kernel upgrades, which you have to do less with LTS (though for the first 6 months after a release it feels like a weekly chore).


I am sure they are larger, but I highly doubt they invest as much time (full-time), or do it as cohesively as Canonical does. That's without factoring in all of the volunteers and other unpaid contributors. Debian also has a notoriously slow, bikesheddy decision-making process.

Whether you like them or not, Canonical is probably going to be able to do this a bit better. They really can't afford to not do it any better. The mindshare difference is particularly staggering right now. This isn't a direct measure of mind share, but take this as an example:

http://www.google.com/trends/explore#q=debian%2C%20ubuntu%2C...


Doesn't Linux support hotswapping kernels?


Ksplice [1] lets you do that. For a while they had very limited support for it, but I just checked and they seem to support Ubuntu on the desktop as well [2]. Once this comes to the server edition and/or is bundled by Canonical, using Ubuntu would become much smoother.

[1] http://www.ksplice.com/

[2] http://www.ksplice.com/uptrack/download-ubuntu


> The current difference in manpower and mind share is pretty big between these two related distros

Do you realize that Ubuntu uses Debian as a base and most of the effort is actually done by Debian developers?


I suppose derivative distros like Xubuntu and Edubuntu will need to follow suit.

I've had pretty bad experiences with non-LTS versions of Xubuntu. The last time I had a go at it on my laptop, things broke unpredictably, a little at a time. When I say unpredictably, I mean totally by surprise, not even after performing an update. First, PulseAudio stopped working. I spent a few days trying to fix it, then gave up and reverted to ALSA. Then the graphical login got stuck in an endless loop. So I reverted to text login, starting X manually after I'd log in. When wireless stopped working, I gave up. I run the LTS Xubuntu on two other machines and neither of them has had problems like this. I put my laptop on the LTS Xubuntu and it's been good since.

I'm guessing mainstream Ubuntu gets more QA and is more stable than Xubuntu. It might be harder for derivative distros to keep up with a rolling release. Or maybe not... this might give them more flexibility to QA things as needed instead of trying to keep up with a new release every six months. As long as Canonical keeps the LTS cycle going, moving to a rolling release sounds reasonable.


One thing to keep in mind is that you're going to have far fewer problems with compatible hardware.

On my older laptop, things were breaking on each upgrade. Most problems I had were with its GeForce graphics card, but wireless also stopped working at some point in combination with the router that I have at home (it kept working with other routers just fine, so I guess it was a protocol issue). Anyway, it required constant tinkering.

So first of all, on that laptop I gave up on Nvidia's proprietary drivers and went with Nouveau, which for my needs works well. 90% of all problems I ever had were linked to these proprietary Nvidia drivers. Don't know how well Nouveau works with newer Geforce cards, but you should give it a try.

I also have a new Thinkpad with Intel HD 4000 graphics and the Wifi chip is also from Intel. I rejected other models with Geforce cards on purpose. With the latest Ubuntu everything worked flawlessly out of the box on this laptop, graphics, wifi, sound, everything.

People expect Linux distros to run well on any hardware they throw at them, and many times they do, but if you're buying a new laptop/PC, do some research first and prefer models for which all drivers are available as open source. Or, if you don't want that hassle, search for models certified for Linux. Thinkpads are good (if you get one with Intel HD graphics at least), Dell also sells a couple of Ubuntu-certified models, like that new developer-targeted laptop, and there's also System76; I'm sure you can find others.


Considering the tipping point for me leaving Ubuntu was being tired of the upgrade-and-break release process, I second this idea.

They are basically adopting the Debian scheme, only the stable releases will be more frequent.


Debian also has a roughly 2-year release frequency: 2005, 2007, 2009, 2011. Testing is already frozen, so 2013 is likely.


I thought Ubuntu's original reason for existing was because Debian releases don't happen often enough?


It existed because testing wasn't stable enough and stable wasn't fast enough. The 2-year LTS releases are basically just the slow stable thing carried over, but having 2 layers of rolling release (like Arch), where you have a base repo and a testing repo, lets them have the rolling release without all the continuous-integration problems, if it hits enough testing users first.


I watch Ubuntu's recent steps with trepidation. I love using it for a home Linux machine (indeed, upgrading only between LTSes), and I could not care less about its "convergence" aspirations. I don't want Ubuntu on my phone or my tablet, or at least I don't care.

I worry that eventually its mobile plans will bury the desktop Ubuntu, move it out of focus, so to say. When that happens, I wonder whither I should turn. Debian?


I used to use Debian, and I'm not sure I'd go back. On one hand, the stability was wonderful (Debian was pretty much bulletproof for me), but packages were, for stable, quite old. Hardware compatibility isn't quite as good: hardware (including a USB Wi-Fi dongle) that Ubuntu autodetects and has no trouble with became an adventure to get working.

I tracked stable, so maybe testing would be better in terms of package freshness, but hardware compatibility is Debian's Achilles' heel right now.


Depending on what you do with your machine(s), it might make sense to go with whatever versions the distro ships for the vast majority of packages, but manage the few that matter to you yourself. For example, I use Oracle's VirtualBox PPA, and it just updated as a normal software update last night. That's 4.2.x, compared to Ubuntu's 4.1.x. For things that aren't PPA'd, you'd need to pay closer attention. It's an option.
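
For anyone who hasn't done it, opting a single package into a PPA is only a couple of commands (the PPA name here is hypothetical):

  add-apt-repository ppa:example/virtualbox   # hypothetical PPA name
  apt-get update && apt-get install virtualbox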


I wonder how much this proposal is driven by Ubuntu's push into the mobile space versus improving the Ubuntu project at large. Clearly it's possible rolling releases would aid both.

I guess what I'm really asking is what are the possible negative implications of the plan for the more "traditional" Ubuntu users? Many of whom are likely sysadmins trying to maintain consistent package versions across their infrastructure. Sticking to the LTS versions is the first step. But then you would be delaying access to newer package versions without the interim releases.


One thing I'd add as a suggestion: if they do a rolling release, hot-rebuild the install ISO, so that if you download an "Ubuntu rolling" ISO you get a 100% up-to-the-minute copy with all the latest security patches and so forth. This is so that starting a new Ubuntu install is not annoying (which it would be if you basically had to throw away and redownload the whole OS as updates as soon as you installed).


There's already a daily ISO of the development release, so I imagine that would carry on under a rolling release.


Yes Please! I'm sick of PPA-ing for updated packages.


Really? I've always felt that they give me the perfect balance: the distro itself is non-rolling, so breakages (possibly) happen at known intervals, yet PPAs let me "go rolling" for a subset of packages where I want the latest and am prepared to handle breakage at any time.


I wonder if there is a way to build this process more into the package management system. Maybe have two sets of repositories: a 'stable' repository that would behave like current non-rolling repositories, and a 'rolling' repository that would get new versions of software once they've been tested. By default, software would install from the stable repositories, but you could do something like 'apt-get make_rolling python' and get the rolling-release version of Python. Such a system should then be able to handle dependencies that would need to be made rolling as well.

This clearly means more work for the maintainers, but only because the PPA method is so clearly a hack that there is no expectation that it must just work.


This is similar to what the backports repository provides. After you add the repository, apt does not install anything from it by default. Instead, you have to tell it to, e.g. apt-get -t squeeze-backports install "package"
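
For reference, adding the repository is a single sources.list line (the squeeze-era host is shown as an example; newer releases use a different suite name):

  deb http://backports.debian.org/debian-backports squeeze-backports main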

https://help.ubuntu.com/community/UbuntuBackports

http://backports-master.debian.org/


Apt supports this. However, no distribution officially supports that scheme.

http://wiki.debian.org/AptPreferences

http://serverfault.com/questions/22414/how-can-i-run-debian-...

http://blog.drewwithers.com/2011/06/mixing-debian-stable-and...

Debian stable + backports is probably closest. Unfortunately, there are few backports: partly because few people use it, and partly because upstream rarely cares about Debian stable.
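
For the curious, a minimal apt preferences sketch of the stable-by-default, rolling-on-request scheme (the priorities are illustrative):

  # /etc/apt/preferences -- prefer stable; keep testing installable on request
  Package: *
  Pin: release a=stable
  Pin-Priority: 700

  Package: *
  Pin: release a=testing
  Pin-Priority: 200

With both suites in sources.list, a plain apt-get install pulls from stable, while apt-get -t testing install foo opts a single package (and its dependencies) into the rolling suite.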


Can't you do this with Synaptic? Include apt lines for multiple versions, then select "prefer versions from Oneiric". That lets you force a version for specific packages while not updating the whole system.


You really are at the mercy of the PPA maintainer, too. For some packages, the maintainer's interest may not be there for more than a release or two. That's a pain, when I could just be pulling from multiverse/universe/whatever it was called.

Obviously, this is less of a problem with the more popular PPAs (Postgres, for example).


PPAs should be for cutting-edge stuff, not stuff that was stable 4 months ago and hasn't been updated in "main" yet.


But "stuff that was stable 4 months ago" is not a trivial case either. Suppose A version 1.1 was stable 4 months ago, but that the version of B that works with A 1.1 is horribly buggy still, and the stable version of B needs A 1.0. Do we upgrade A? I can imagine this becoming very tough to mange with 20k-30k packages. I hope Canonical can deal, but I do fear that the users will suffer.


If it's a library, it should be trivial to have the latest A 1.1 installed alongside the older A 1.0 and stable B (shared libraries with different sonames can be co-installed; Debian routinely ships, say, libfoo1 and libfoo2 side by side).


With potentially hundreds of different packages playing the role of B, this could eat up a lot of manpower. What if the breakage is subtle and not well known upstream? While it could work, it sounds like something that would take a lot of QA work by Canonical.


That problem already exists, except currently there is generally a six month minimum before it gets resolved.


> That problem already exists, except currently there is generally a six month minimum before it gets resolved.

I view it differently: Currently, the situation doesn't unexpectedly change for six months :-)


This could work: rolling release + LTS. Arch Linux seems to be doing quite well under the rolling-release auspices, so why not.

Of course, never in my experience has a rolling release worked in anything close to production environments - it's what killed FreeBSD in the data centre. However, if you're a techie with some time on your hands, it could be cool.



Not totally dead, it still has some strongholds left, but I'm sure Netflix doesn't use the rolling-release side of it (ports mainly). :-)


No one has ever tried this with an OS before.


What do you mean? Arch and Gentoo are two pretty successful distributions and both use a rolling release approach.


Sarcasm doesn't translate very well over the Internet.


Debian testing and unstable.


I'd like to see them balance out dropping the interim releases by doing slightly more frequent LTS releases. Maybe every 1 or 1.5 years instead of 2. But obviously there might be manpower issues there. I just think they're abandoning a middle ground of stability that's still a very valuable space.


I really like the way FreeBSD does it: stable base system but all (ok, most) 3rd party software is still fresh.


This seems like a reasonable proposal to me. I've started using the LTS releases exclusively for my desktop and server systems. I'm planning to use the Ubuntu phone heavily, and would love for that device to be on a rolling-release schedule.


Canonical should just wait until the 64-bit ARM build of 14.04 LTS is ready before they put it on mobile. That should greatly simplify their lives when it comes to upgrading later.


I can't use Ubuntu; I've tried countless times, and the experience sucks. I'm trying Mint, and I can't even access the latest libraries: everything is old, there's no partition resizing during installation, and there's a lack of distro tools and polish. In Mageia everything is fresh, and if it isn't, the free support quickly updates it.

Mandriva had rolling releases with Cooker, and Mageia has them too with Cauldron. Sorry for looking like a troll, but I never got the hype with this Ubuntu distro. I wish HN supported Mageia more, since DistroWatch even reports Mageia as the number two most popular distro.


If you need the bleeding edge of a package, install that package from a pinned archive (or similar). If you need the bleeding edge of everything, I don't believe you. If you just want the bleeding edge because 'hey, it feels cool', then you're not the target demographic for Ubuntu - something they stated from the outset.



