Ubuntu 13.04 Raring Ringtail Released (ubuntu.com)
368 points by zeis 1494 days ago | 288 comments

Warning: according to the known issues in the release log [0]:

* Skype won't start (without a fix [1])

* Google Chrome won't install (without a fix [2])

[0] https://wiki.ubuntu.com/RaringRingtail/ReleaseNotes

[1] https://bugs.launchpad.net/ubuntu/+source/skype/+bug/1155327

[2] https://code.google.com/p/chromium/issues/detail?id=226002#c...

"Actually, having a workaround by definition lessens the severity, as if someone wants to still run Skype, they can."

I don't think everyone is on the same page about Ubuntu being a consumer distro and not a hackers-only distro... I have definitely noticed a decrease in the stability of Ubuntu releases over the years, and attitudes like this seem to be partly responsible for it.


Oh I kept reading, this is better:

"Rainer, a better solution for international calls is to stop using Skype (which is closed source, and frequently has problems with its Linux client) and change to a service that supports standard protocols (SIP etc.), e.g. using Ekiga"

Yes, the client should change their software and infrastructure to better suit your product...

This makes it pretty much impossible for me to "sell" Ubuntu to any of my friends who aren't programmers AND tinkerers (many of my programmer friends are too busy being productive to be interested in dicking around with making things work anymore, and increasingly I hold this view).

> many of my programmer friends are too busy being productive to be interested in dicking around with making things work anymore, and increasingly I hold this view

This. I switched to OS X from Ubuntu a few months ago for exactly this reason. I was tired of tinkering and configuring endless .conf files, adjusting my GNOME/Xfce panel layouts for optimum productivity, and changing my display manager because all of them sucked.

In my final days with Linux I was running Ubuntu Minimal with twm -- I went completely old-school. Fewer things at the core, fewer things to go wrong.

I caved and bought a MacBook Air. While I miss the endless configurability of Linux, I find that most of the time I spend trying to configure OS X is only spent to make things more suitable for my workflow -- changing the number of desktops I have, changing the applications in my dock, or adjusting Exposé to show all app windows.

I never have to configure something because it breaks or isn't compatible with something else. I could have gone the FreeBSD way, but then, well, you know. OS X just works. As a Ruby and Java developer I find it perfect. I probably will try the 13.04 release on the Mac (I tried the beta and liked it), and I'll see how it goes.

It looks like a solid release, so I'd really like to keep it (but I probably won't because I'm cheap and only have the 120gig SSD).

Except for a few driver issues (Nvidia and Optimus, doh!), I haven't had an issue with Ubuntu or Mint. Any other configuration issues I would also have had with Windows or OS X, since they were application- or server-specific.

But I don't spend days tweaking the UI, I have a baseline of UI expectations that's not too difficult to meet. (But Windows 8 doesn't meet it... 'nother story for 'nother day...)

I've had tons of issues, especially around power management, multi-monitor support, etc. When I "Hibernate" I also need to roll a d10 to know if it'll actually come back up. This is anecdotal, but on the same hardware there are no issues under Windows and everything works as expected. Eclipse seems less stable and slower under Ubuntu, which is probably related to the Java runtime. Firefox is also less stable. The webcam sometimes doesn't work. Audio suddenly stops playing until you touch the volume. I couldn't get WebEx working under Ubuntu (tried lots of suggested workarounds). The kind of stuff that just works in other OSes.

On what hardware?

I have the same issues on an Asus laptop. It was well supported on 12.04 LTS, but 12.10 is obviously a big failure. Not to mention the system slowing down by itself, some swap mess I don't have time to fix, and the overall growing instability of the system tools: Memcheck segfaults, then the report tool activates itself, inspects, segfaults, launches again... I also upgraded a friend's PC from a good old Athlon XP to a shiny new i3, and guess what: the official, libre graphics driver is just unable to run correctly, apparently because of a bad switch in an internal opcode.

Something is wrong here. The massive shift of devs from Linux to OS X should warn some people somewhere about the overall quality of their stuff. A quality that we have been proud of for decades.

Dell laptop here (Core i7 + the dreaded Optimus)...

Have you checked this out? http://bumblebee-project.org/

My Canon printer won't work properly in Ubuntu.

My printer, scanner, graphics tablet, and sound card won't work in Windows (the printer manufacturer explicitly refuses to support 64-bit, the graphics tablet bluescreens Win7 when plugged in, the others just do nothing). All work flawlessly out of the box on Ubuntu. Hooray for anecdotal evidence \o/

You don't like Ubuntu dictating a desktop environment to you. So you fiddle with other things for a long time, at your own option. Then you pay Apple to dictate a desktop environment to you. Hmm...

He explicitly said he misses the configurability, so I wouldn't blame him for that.

As a Mac user that runs triple boot, the "just works" mentality in OS X is present only as long as you don't step outside the typical use cases. It's great to be, say, a writer on OS X; there are very good tools and apps for writing in a minimal, aesthetic environment (WriteRoom and others). On the other hand, as a "hacker", I've always had trouble in OS X -- from Page Up/Page Down not registering properly in Terminal, to full NTFS and ext4 support (FUSE used to work, but I think it was not maintained), to getting vim, gcc, and other tools of the trade working on OS X (yes, there are ports-like systems on OS X, but hardly ones that I'd say "just work").

On the other hand, in pure Linux world, brightness controls, backlighting controls, volume controls and multiple sound card support were all rather tricky when I tried using Gentoo on the same Mac for a time. Some of those I solved, but some I just gave up and was content that it kinda-but-not-completely works.

Ubuntu seems to be tested enough on Macs, so those things work on my laptop now without having to configure too much. It may not be the perfect "just works" experience, and I have seen a lot of things going the wrong way on Ubuntu, but currently it's a good mix of being easy to maintain and being easy to start doing actual work with.

I didn't leave the Linux world because I didn't like Ubuntu dictating a desktop environment to me. I left because OS X suits my needs better than any Linux distribution (except perhaps OpenSUSE) could.

Logical fallacy. You've built a strawman, based your argument off it, and then contradicted yourself.

No they didn't. Where?

It's strange, because the overwhelming consensus so far has been that the later releases have been much better tested and more stable than any before, with the sole exception of LTS releases. Still, it's always been good advice to hold off on upgrading until a release has been out for a month or so, and for most users staying with the LTS release is probably good enough anyway.

There's a huge QA process for Ubuntu now -- packages have to go through a large swath of automated tests, and the alpha/beta versions are kept working on a daily basis. It was only 2 years ago or so that most Ubuntu developers didn't even expect the unstable release to be usable until the feature freeze milestone had well passed.

>many of my programmer friends are too busy being productive to be interested in dicking around with making things work anymore, and increasingly I hold this view

This is a sign of maturity. Time is our one true finite resource, and we need to prioritize what we do with it.

I once bought a laptop which ran Windows 7. The Wifi did not work due to their drivers being broken.

The correct move then is to return it.

No. The correct move was to install Ubuntu, which had the correct drivers. Point being that all software has issues. More so if the software is as complicated as an OS with two releases per year.

You bought a product and it did not work as advertised. The recommended path would have been to have it serviced or return it to the distributor. You could also attempt to install the correct working drivers, as this most likely would not void your warranty. However, changing the entire OS is a very big not-recommended move on a recently purchased laptop that's having such a simple problem.

Why post something like this? I don't get it. He stated what he did to fix the problem and it apparently worked.

So his recommendation is proven to work and yours is....?

It's smothering a candle with a firehose. It's a very destructive solution to a problem for which avenues have been paved beforehand. Did it work? Yes. But to fix one small problem (malfunctioning WiFi) he rendered himself incapable of doing other things (using Microsoft-compatible software, and, no, I'm not including WINE).

In the end, if his purpose was to install Ubuntu, then good for him. However, I wouldn't give it as general advice, for the warranty risk alone.

You missed the point by two blocks, which is that OSes have bugs. In my case, I was going to install Ubuntu either way; I was making a counterargument to a comment above. Still, installing another OS does not void the warranty where I reside.

Guys, Skype is owned by Microsoft, and the Linux version hasn't been touched in like 2 years. I'm frankly not surprised it's b0rked.

So... yes, the user should use software that isn't deprecating support for their platform.

They are releasing steady updates for it. 4.1 came out Nov 2012. That's hardly 2 years. That's after the last Ubuntu release. And it's fully supported for the platform. What are you talking about?

Well that's news to me. The last I looked was before November, I guess, and before that it still had the same old crufty UI from like 5 years ago. The userbase had been pretty up-in-arms about it for a while too: http://community.skype.com/t5/Linux/Vote-here-for-linux-skyp...

Edit: also, they're still sitting on version 5, which introduced group video calling. It's a pretty sad situation overall.

Much as I can't stand to say something nice about Microsoft, the Skype updates at least on Ubuntu 12.04 have worked fine.

Ubuntu has no control over the Skype project. They aren't obligated to support every third-party product when it is impossible to do this.

I don't think this is the problem with recent Ubuntu releases.

> Ubuntu has no control over the Skype project. They aren't obligated to support every third-party product when it is impossible to do this.

That's an interesting attitude.

Backwards compatibility is generally regarded as a feature. If you read Raymond Chen's blog about the lengths Microsoft went to in order to support old software on new versions of Windows, you'll see how important some operating system vendors consider it.

Linux generally seems to have an interesting attitude to backwards compatibility. "It works on old hardware" is something that used to be regarded as a huge benefit of Linux.

These days - especially for Ubuntu - it doesn't seem to be regarded as a feature at all. Additionally Ubuntu seems to introduce breaking API changes in every release. I find that to be quite a surprising route to take for an OS that is trying to build desktop share.

I'd be quite interested to hear why people think they do this. Is backwards compatibility just not seen as important at all or is there a bigger strategy here?

Actually, they should try.

They are not obligated, but it is in their best interests to build a reputation of not breaking things.

I do think that Linux lacks a reputation for not breaking existing programs, much as it lacked a reputation for ease of use before Ubuntu.

Now, ease of use, and not breaking things? I bet that kind of reputation is worth millions of dollars.

Even Linus agrees about it, when it comes to the kernel breaking userland programs. The kernel developers have no control over userland programs. Nonetheless, the kernel developers try very hard not to break userland programs.

To be fair, I have definitely noted a decrease in stability of OSX releases of the last few years too.

At least they're consistent in not liking Microsoft.

Ubuntu bug #1, for reference https://bugs.launchpad.net/ubuntu/+bug/1

"Yes, the client should change their software and infrastructure to better suit your product..."

The person you quoted isn't even a developer as far as I can tell.

So, a "hacker's distro" should have bugs in it? No thanks, I'll take functional software.

I don't use Skype (imo.im works much better, and you don't have to install any programs), so I don't know anything about it, but Chrome seems to work.

Jitsi (jitsi.org) does video calling over Jabber and others.

Though it's not web based like imo.im, you also don't have to give your access credentials to a third party.

imo.im and similar services have one disadvantage: you give the keys to all of your communications to an additional 3rd party (as if Microsoft wasn't enough).

3rd party? Microsoft own Skype.

You give your keys to imo.im.

They said additional 3rd-party, which implies MS is a third-party?

It is. Isn't it?

They aren't. They're the provider of the service.

...and a third party to your conversation with a friend.

Thanks for the plug, imo.im looks really cool! Can't believe I didn't know about it before.

It seems to be the spiritual successor to Meebo webchat. So far it's the only web-chat site I have found that supports Jabber. I would absolutely recommend it!

I use their Android app to stay signed into my professional and personal IM accounts at the same time. It's the best I've found.

imo.im is pretty great. I think they even had (web based) video calls with Skype accounts, at least until Skype restricted that type of access to 3rd parties.

Why can't I find a download link for linux anywhere?

Oh, there is no download for linux... I'm not much for browser chatting.

The Skype bug looks like a Multiarch problem and there appears to be a workaround in the bug report:

    LD_PRELOAD=/usr/lib/i386-linux-gnu/mesa/libGL.so.1 skype
which is trivial to implement either in a custom .desktop file dropped in ~/.local/share/applications or a custom launch script in /usr/local/bin (or whatever comes before /usr/bin in your path).
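
As a sketch of the first option, such a user-local launcher override might look like this (the file name and every field except the LD_PRELOAD path, which comes from the bug report, are assumptions based on typical Skype packaging):

```ini
# ~/.local/share/applications/skype.desktop
# A user-local launcher that shadows the system-wide entry of the same name.
# The Exec line wraps the binary with the LD_PRELOAD from the workaround.
[Desktop Entry]
Type=Application
Name=Skype
Exec=env LD_PRELOAD=/usr/lib/i386-linux-gnu/mesa/libGL.so.1 skype %U
Icon=skype
Categories=Network;InstantMessaging;
```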

Trivial maybe if you're a power user that knows Linux and Gnome's desktop shortcuts by heart and isn't afraid of hacking, but definitely not for the regular user Ubuntu was intended for a long time ago.

Said user should then have a system administrator to set up the system. Really, you either read the manual to learn such things or you get someone to do them, expecting a system as complex as a computer to work out-of-the-box is nonsense.

What you seem to be saying is that in order to use a personal computer, you should be an expert.

I disagree, and I'm glad that most people disagree as well. User-friendliness is important regardless of what system it is, even Linux. Have you noticed that the most successful distributions of Linux are the ones that are most friendly to beginners?

This elitism is destructive and serves only to make the "experts" smug and keep everyone else in the dark. I've found that the real experts are the ones who are writing guides for solving issues so that beginners can solve them too. That way, everyone benefits... and the beginners get to learn too. Almost all of my experience with config files has come from having a problem, Googling it, copy-pasting some commands into the terminal, and then becoming interested and messing around with it.

I had a taste of those elitist attitudes a few years ago when I was looking up how to add a user on an Ubuntu server and the top result for a search on "ubuntu add user" was this page:


Ubuntu has two commands for this, useradd and adduser, where adduser is the easier one for newbies, while useradd is more flexible but has a lot of ways you can get it wrong. In fact, 'man useradd' recommends using adduser instead.

Unfortunately, that how-to page starts with useradd, and the first example on the page is:

    useradd <username>

It isn't until later on the page that it mentions that there are some options you must add to the command if you want it to set up everything correctly for a typical user. And if you leave these options out, the page doesn't explain how to fix it after the fact.

Finally, near the bottom of the page it mentions the adduser command that would have worked with no special options and no hassles.
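
To make the contrast concrete, here is a sketch of the two approaches on Debian/Ubuntu (the username is a placeholder, and since creating users requires root, this block only prints the commands rather than executing them):

```shell
# Low-level useradd: without -m no home directory is created, and without
# -s the account gets the system default shell -- the trap described above.
# The Debian/Ubuntu adduser wrapper needs no flags: it creates the home
# directory, copies /etc/skel, and prompts for a password interactively.
cat <<'EOF'
useradd -m -s /bin/bash newuser   # low-level; flags required
adduser newuser                   # high-level wrapper; just works
EOF
```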

So, being at the time a Linux newbie, I worked through the how-to and tried the useradd command it suggested, got myself into a situation where I wasn't sure exactly what I'd really done and hadn't done, then finally discovered the adduser command I should have used in the first place. I posted a reasonably polite complaint that the how-to had led me down the wrong path and it ought to start with the recommended adduser command instead of just mentioning it in passing at the end.

The responses I got were fairly eye-opening. (Click the "show archived reader comments (36)" link to see them.)

This one was my favorite:

> Michael Greary [sic] you are a D&%K H#$&D. You’re the 1 that messed it up no1 else, and anyone else for that matter.

> Hopefully you’ve learnt by now that when blindly running linux commands you should read the whole article to make sure it’s what you want first.

> Blaming it on other ppl isn’t gonna help either. You should’ve said it by admitting your mistake, asking for a way to rectify it.

> (sigh) I bet you have no friends

I have to admit he was right. Clearly, I am not worthy of using Ubuntu!

That article is infuriating.

I searched for an alternative. The second Google result for 'add user ubuntu' is: https://help.ubuntu.com/community/AddUsersHowto

There are problems:

* The article has a pants title - no novice in their right mind is going to click the link over the first result with the better title

* It begins with a tag saying it's out of date

* It's marked as immutable so those who care can't make it better.

What can we do to make these terrible articles go away?

Of course, you tacitly skipped over the other helpful comments and instead chose to concentrate on the worst comment in that thread. However, that doesn’t render the message delivered there wrong in any way – read the whole article/how-to before starting, just as you read the whole recipe before starting to cook.

So, yeah, you

a) didn’t read the manuals supplied with your distribution

b) instead went directly to a random third party site via Google

c) blindly trusted said site without fully reading it

d) messed up while doing so.

You should have gotten a computer-literate friend to help you, just as you don’t fix your car by looking for a howto online while dismissing the manual you can usually find in the glove compartment.

Thanks (belatedly) for your comments.

The thing is, of course I know I screwed up. I don't need you or anyone else to tell me that. I already know it.

But that has nothing to do with my complaint: the top match for a Google search for "ubuntu add user" is a site that gives out poor quality information, doesn't improve that information as a result of user feedback, and has a community that is positively hostile to feedback. Yes, as you point out, there were also helpful comments in that thread. There were also several other people who had the same problem I did, and the site never did anything to improve its content.

Isn't it obvious that this was a lousy tutorial? Doesn't the author of a site like this have any responsibility to put out useful information? Does the fact that I screwed up make my feedback any less valuable? If you're writing tutorials, that's exactly the kind of feedback you ought to be looking for, so you can improve your tutorials.

Contrast it with Arch Linux's page on user management:


> To add a new user, use the useradd command:

> # useradd -m -g [initial_group] -G [additional_groups] -s [login_shell] [username]

Here they do use the same useradd command that I was complaining about (probably because Arch doesn't include adduser by default?), but they give a complete example with all the options required, followed by an explanation of those options.

Now that's how you do it.

BTW, I was almost tempted to take offense when you said that I "should have gotten a computer-literate friend to help". Yeah, I guess I'm not very computer-literate. After all, I've only programmed in Algol-60, APL, assembler (Sigma 7, PDP-10, 6502, TI 9900, 8080, x86), AWK, Basic, Bliss, C, C++, COBOL, CoffeeScript, Forth, Fortran, Go, Java, JavaScript, Jovial, Lisp, Lua, Pascal, Perl, PL/I, PL/M, Python, Ruby, SIMPL, SQL, Trac, and a few other languages you wouldn't have heard of.

But we're all newbies at something, some of the time. In my case it was basic Linux user setup.

Maybe you even screwed something up once, as a result of somebody giving you bad information that you didn't thoroughly check out. (Not saying you did, just possible.) If that ever happened, which would be a better response from the source of that information: chewing you out thoroughly, or correcting the bad information?

And this is what's so frustrating. I love Linux. I truly love the idea of making everything open from the kernel to the desktop environment to all of the applications you can run on it. I truly love the idea of customizing everything... and if you don't like what's there, you can modify it to your heart's content if you learn how. That's just freaking cool to me.

Except... we have douchebags in the community who talk shit to newbies and then cry, "Why does everyone use Windows? How come Linux isn't more widely used?"

Just check your post history and read the number of times you wrote "RTFM." That's why.

Well, there's shit-talking the newbies, and then there's refusing to lower your system to the lowest common denominator by dumbing it down.

Not being careful of that second one gets us things like the GNOME 3/Unity/Windows 8 design-by-committee that seems to be loathed so much by power users.

Is it so wrong to expect some basic minimum level of competence in a system before someone uses it? And then suggest that someone take the most basic of steps to acquire that competence before asking questions? (By R'ing TFM?)

Systems should as a rule be somewhat intuitive (IMNSHO), but you can only take that so far before you start hamstringing yourself. Devs would never get anything done if they spent all their time training people how to use systems.

You're definitely right that there is a fine line between user-friendliness and "The user is an idiot" in UI choices. Right now I'm using Lubuntu; I dislike Unity immensely.

Remember that everyone starts out incompetent. They ask dumb questions, and in some cases entitled newbies expect the experts to bend over backwards for them. But one thing that's nice is that a lot of questions have been answered before by patient experts.

Say there's a guy on the Ubuntu forums who asks the following question:

"Hey, how do I change my desktop resolution?"

An asshole answer is "Fucking Google it, idiot. It's not hard."

A decent answer is, "Here's a link to the answer. In the future, I suggest Googling your issue. You don't have to wait for us to answer it! Often, questions have been asked and answered before. We're here if you get stuck, though."

This has a few benefits. The first (obvious) benefit is that the newbie doesn't immediately think "Wow. What a dick, I was just asking a simple question."

The second benefit is that future newbies who Google the same question get a link to the answer.

Of course, there are lost causes (Where's the Start button? This isn't like Windows at all. I don't like it) and asshole newbies who simply refuse to try to learn anything by themselves, (Ok, you answered my question on desktop resolution. How do I change my desktop background?) but I think these people are the exception rather than the norm. And they can be dealt with in a way that is constructive rather than crass and uncouth.

Try giving the ‘decent’ answer to the thousandth person asking that very question without any sign whatsoever of the work they put in. Or to the person who obviously did not read the rules for the mailing list they signed up to and instead ‘just asks a simple question’. There are, unfortunately, many more entitled people than you think.

Fortunately, you can easily killfile people in mailing lists, and doing so quickly (i.e. after one wrong post) helped me greatly.

> What you seem to be saying is that in order to use a personal computer, you should be an expert.

No. What I am saying is that a) using a device is different from administrating it, and b) in order to administrate a device you have to be an expert. You don’t expect to buy a car and never ever go to a workshop with it, because, you know, you shouldn’t have to be an expert to service your car. Similarly, expecting to be able to service a PC without being an expert is just as futile.

> Have you noticed that the most successful distributions of Linux are the ones that are most friendly to beginners?

What metric/definition of successful are you using? [0]

> (third paragraph)

Exactly. Your point being?

[0] Hint: Number of users is not a measure of quality, and hence not necessarily a measure of success.

I would argue that popularity is actually a great indicator of the quality of an OS and your experience with it.

I am what you would consider an "expert" (or at least, I can definitely competently administer a linux computer) but I don't use linux distributions for day-to-day computing. The reason is that linux computers, even ones running "popular distros" such as Ubuntu, are considerably more prone to interoperability foibles and installation/updating problems.

The reason popularity matters, even though I have the technical skill to administer any OS competently, is that popularity is the single best incentive to encourage third parties to iron out the kinks on a particular platform. The reason AMD drivers work better on Windows than they do on Linux is not because Windows is "better" by any of the metrics of quality you espouse, it's because Windows is more popular. The reason Steam for Linux is only supported on Ubuntu is not because Ubuntu is "better" it is because it's the most popular distro. The reason to use popular platforms is because all of the little things that require a half-hour of expert time to work around would have already been solved by the vendor who recognized that such issues are worth testing for and fixing for a platform with 100 million users, but not for one with 20,000. That's distasteful to some because it doesn't seem meritocratic and it's out of the control of the creators of the OS, but it's a fact of business, and it does affect the quality of an OS in tangible ways.

As for the "expert" thing, the costs really aren't that different for various kinds of users. The reason I care about this stuff is because the opportunity cost of my time spent solving menial issues that would have been solved by the vendor if I were on a more popular platform is too high. The reason my mother cares about this stuff is because the transaction costs of finding someone who will spend the time to figure out what's going on and fix her computer is too high.

> The reason popularity matters, even though I have the technical skill to administer any OS competently, is that popularity is the single best incentive to encourage third parties to iron out the kinks on a particular platform.

I prefer a working platform that forces me to read the manual beforehand over a platform where I don’t have to read the manual but random third parties iron out kinks.

Popularity is a measure of quality, but only of quality with respect to the use cases of the people among whom the product is popular. My computer usage is very different from my mum’s, and likely from 95% of the population; hence, the popularity of a product among 95% of the population is rather irrelevant to me.

If you’re a single person with a tight budget looking for a car, you don’t look at car popularity among large families owning oil fields, but among people similar (in the relevant ways) to yourself.

Hence, popularity among a large user base is an indicator that the OS is optimised for use by a large part of the population. If you find that this is indicative of the quality you’re looking for, fine.

> I would argue that popularity is actually a great indicator of the quality of an OS

Windows is tons better than OS X then?

Windows is widely distributed... not popular.

Couldn't you say the same about Ubuntu? It's far more widely marketed to end users than any other distro.

No, Ubuntu is chosen by its wise users, not "marketed".

So wise.

I was specifically using "successful" to mean "number of users." People use what they like, and Ubuntu has gotten a lot more users than, say, Gentoo. Is Ubuntu superior to Gentoo? That's a matter of opinion; I'm sure that there are plenty of applications where people have decided that Gentoo is the perfect fit for what they want to accomplish. But for personal computing needs, people tend to use what is most user-friendly.

I don't mean that it's okay to be completely illiterate in how a computer works, especially if you're using Linux. To take your car example, I don't think that a driver should know how to rebuild his engine. I do expect the driver to know how to change his oil and brake pads and know about scheduled maintenance.

The same is true for computers. A Linux user should know basic bash commands, how to install software, how to research problems, and so on. But he shouldn't be expected to know how to troubleshoot driver problems or know the ins and outs of xorg.conf. That's what Google and the aforementioned benevolent experts writing guides are for.

While I agree that we should do everything we can to be user-friendly and not allow what I'm about to suggest to take the form of "elitism", I think we underplay the importance of helping the user understand the system. I think your typical user should understand SSL before they do online banking and shared libraries before they let their work rely entirely on their computer, in the same sense that I think you should know how to change a tire and change your oil before you buy a car or go on a long road trip.

Yes, they all should just "work out of the box", but in a more and more literal sense we trust a lot of machine with our lives. We ought to understand all of these machines better before we trust them with our lives or vote on the leaders who will regulate their use.

We have abstraction layers precisely so that you can use things without understanding them. It's not just computers, every day we use all kinds of technology without properly knowing how it works.

Of course, sometimes those abstractions are leaky, and we have to change a tire or apply an update. But suggesting that typical users should read up on SSL (and therefore asymmetric cryptography) before they use online banking is as ridiculous as saying that every new driver should study the detailed workings of the internal combustion engine. Someone needs to know how it works, but my mother really doesn't.

I've personally seen people click through the most alarming-looking certificate errors and start entering their bank login credentials. I've driven in cars with people who had put up with that knocking sound in the engine for months because it wasn't getting any worse. If your mother did any of these things, I would be very worried about her financial and physical safety.

You don't have to know how either works to recognise something is wrong with them. My mother, happily, is quite cautious and would go and find someone who understands the machine better. In the computer case, I'd likely be getting a phone call.

The certificate errors are kind of a design failure. Users are so used to clicking through messages they don't really understand, and this is just one more. Even the most alarming certificate errors are usually innocuous, in my experience - someone forgets to renew a certificate, or fails to get a new one for a domain change. That doesn't mean it's OK to ignore them, but we really need systems that don't cry wolf unless there's a really good chance that there's a wolf.

> Even the most alarming certificate errors are usually innocuous, in my experience - someone forgets to renew a certificate, or fails to get a new one for a domain change. That doesn't mean it's OK to ignore them, but we really need systems that don't cry wolf unless there's a really good chance that there's a wolf.

How do you propose to algorithmically determine if a certificate not matching a domain change is innocuous? This is a bit like suggesting having a "check engine" light that only comes on when there's an immediate risk of an engine fire.

For starters, there are a whole load of conditions that are no less secure than a standard http connection, such as self-signed certificates. But in those cases, the browser throws up a huge and misleading warning about it being insecure. Insecure is the default on the web: we should focus on positive signs that a site is secure.

Secondly, the system should make it harder for the administrator to make mistakes. For instance, the web server could refuse to serve https on a domain that it didn't have a certificate for. Or when a certificate nears its expiry date, the administrator should be getting plenty of reminders about it.
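That last point can be a one-liner. A sketch using openssl (with a throwaway self-signed certificate standing in for a real one; the paths and the 30-day window are arbitrary):

```shell
# Generate a throwaway self-signed cert just for this demo; a real one would
# live somewhere like /etc/ssl/certs/.
openssl req -x509 -newkey rsa:2048 -nodes -days 90 \
    -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem \
    -subj "/CN=example.test" 2>/dev/null

# Print the expiry date:
openssl x509 -in /tmp/demo-cert.pem -noout -enddate

# -checkend exits 0 only if the cert will NOT expire within the given number
# of seconds, which makes it easy to drop into a cron job as a reminder:
openssl x509 -in /tmp/demo-cert.pem -noout -checkend $((30*24*3600)) \
    >/dev/null && echo "certificate OK" || echo "renew soon"
```

Wire the failure branch to an email and the administrator gets those reminders for free.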

Thinking bigger, what if we tied it closer to the sensitive UI? What if only pages loaded securely could show a password field? What if rather than typing credit card numbers into boxes on a webpage, we were used to using a special browser interface that would only light up on secure pages?

Malicious intent can be algorithmically determined by checking for packets with the "evil" bit set to 1, as specified in RFC 3514 [1].

[1] http://www.ietf.org/rfc/rfc3514.txt

That is a horrible specification.

"Multi-level insecure operating systems may have special levels for attack programs; the evil bit MUST be set by default on packets emanating from programs running at such levels. However, the system MAY provide an API to allow it to be cleared for non-malicious activity by users who normally engage in attack behavior."

This should say that the system MUST provide such an API. Otherwise, you can find a situation where the receiver may have to allow evil packets to pass, because they are sent by a user who has no option of setting the evil bit to 0 for non-malicious activity. Kind of defeats the point.

"Fragments that by themselves are dangerous MUST have the evil bit set. If a packet with the evil bit set is fragmented by an intermediate router and the fragments themselves are not dangerous, the evil bit MUST be cleared in the fragments, and MUST be turned back on in the reassembled packet."

This places an undue burden on intermediate routers, as they now have to parse the packets and determine if a fragment is evil.

"Intermediate systems are sometimes used to launder attack connections. Packets to such systems that are intended to be relayed to a target SHOULD have the evil bit set." These packets are not 'evil' in any technical sense. They are only a request for another computer to send evil packets.

"In networks protected by firewalls, it is axiomatic that all attackers are on the outside of the firewall. Therefore, hosts inside the firewall MUST NOT set the evil bit on any packets."

This is just stupid. If I am attempting to send malicious packets from within the firewall, I am already required to set the evil bit. Now I am also required not to. If I am sending a non-malicious packet, I am already required to set the evil bit to 0.

Also, if I am on a computer behind a firewall and attempt to send malicious packets to a computer outside of my firewall, I am still forbidden by this to set the evil bit. This will force attackers to either break the spec or give up their own firewall.

"Because NAT [RFC3022] boxes modify packets, they SHOULD set the evil bit on such packets. "Transparent" http and email proxies SHOULD set the evil bit on their reply packets to the innocent client host." I must be misunderstanding this one, because none of those uses seems evil.

"Devices such as firewalls MUST drop all inbound packets that have the evil bit set. Packets with the evil bit off MUST NOT be dropped. Dropped packets SHOULD be noted in the appropriate MIB variable." Dropping packets is a technical inevitability, and a crucial part of TCP's anti-congestion strategy.

Just because the evil bit sounds like a good idea does not mean we should go ahead and implement it without carefully looking at the technical implications of the spec.
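For anyone who hasn't read the RFC: it puts the "evil bit" in the reserved high-order bit of the IPv4 flags/fragment-offset word. A toy check in shell arithmetic, purely to make the discussion concrete (no real packets involved):

```shell
# In the 16-bit flags/fragment-offset word: 0x8000 is the reserved ("evil")
# bit per RFC 3514, 0x4000 is DF, 0x2000 is MF.
flags_frag=0x8000   # a hypothetical packet with the evil bit set

if [ $(( flags_frag & 0x8000 )) -ne 0 ]; then
    echo "evil bit set: firewall MUST drop"
else
    echo "evil bit clear: MUST NOT drop"
fi
```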

The problem with the car analogy, as TallGuyShort points out, is that cars are not capable of having as strong an influence over our lives as computers are. When I think about how this used to be done before, usually finances were done over phone where people got to talk to other people who explained to them things that they did not understand.

I agree that learning about SSL is probably overkill; but we do need more computer literacy among the general population. Think of it like this: finance is something that is hard, yet people learn about the basics so that they can keep their money safe. Basic computer literacy should have a similar status as a skill that is required to, say, keep your information safe.

I think the analogy is better than you are giving credit. I would say cars can in fact have much stronger influences in our lives than online banking can. For example, cars are much more likely to kill you. It's a much more important skill to recognize when your brake pedal is getting squishy so that it doesn't fail when you are careening down a mountain road, or recognize that an oncoming car is swerving erratically, than it is to realize you are logging in to an insecure phishing site. In the former case you end up dead, in the latter case your checking account gets wiped.

To clarify, when I say "learn about SSL", I mean people should know how to tell if the connection is "secure" or not, and that secure ensures that there is an extremely high probability that the domain they are talking to is who they say they are, as verified by some people that Google / Microsoft / Mozilla decided were trustworthy. Similarly, I think people should understand that emails can travel unencrypted between sites and are easily spoofed. I'm not saying they have a working knowledge of the RFCs for SSL and SMTP. I agree with the way you worded it - perhaps I made it sound too involved.

edit: another, perhaps more humorous example. How many times have you heard about people "hacking" their Facebook accounts and getting "viruses" on Facebook. I've been asked by multiple people to fix their computers, when the real problem was they had no idea what they were doing on the Internet.

So, why should I use Linux rather than Windows, which will run all my software? Expecting everything to work perfectly may be nonsense but I have to assume that your average user is going to follow the least effort path.

I never said that you should use Linux or Windows or Mac OS or iOS or whatever else you come up with.

Use the product you feel is best suited to your task. If that is Windows, use Windows.

Edit: That said, you will likely want someone knowledgeable around when setting up Windows or Mac OS X, too, so I don’t really get your point.

The last version I had any trouble with, by way of Windows, was 98 on a Tiny computer I had as a kid, mostly because of esoteric driver requirements. After that, it pretty much just worked. What do you think your average user is likely to do to it that would need them to go whistle for a sysadmin?

I don't know about anyone else, but I still go looking for an expert when I need any of my Windows machines fixed: whereas when any of my Ubuntu machines have issues, I'm comfortable and motivated to look up the answers myself. Partly that's an issue with documentation quality - you're a lot less likely to catch a drive-by download or follow maliciously bad advice on Linux forums - but I also find the workings of Linux much easier to understand.

Googling "help with windows" returns 2.5 billion hits, with a "B". Do not pretend that 100%, or even a large percentage, of Windows users never need power-user help at a minimum.

The most popular desktop operating system on earth returns 2.5 billion hits. So what? Large numbers without basis for comparison are meaningless. I can't think how you'd get that basis here. The only thing you can really say on that basis is that less than 100% of users had no problems.


I didn't pretend that people never needed help with Windows. I said I hadn't had any real problems with it since 98 and asked what he thought the average user needed a sysadmin for in Windows.

I can see how you'd come to the conclusion you seem to have, but I really don't appreciate being made out to be a liar. If you act on a similar assumption in a similar manner in the future then I'm going to ignore anything else you have to say.

What does seem relevant to me is that there's a dramatic difference between an operating system that even programmers get tired of trying to make do what they want and jump ship from, and one that you can reasonably expect not to run into significant problems with for years. If Windows were as messed up as Linux, I'd expect to run into problems at around the same rate.

How long does it take you to run into trouble with Linux? I tried it just now, downloaded Mint and threw it in the old DVD drive - 15 minutes. Most of which were spent trying to work out how to change the primary monitor so I could get the taskbar onto the one I wanted. When I did manage to do so the graphics corrupted and the system froze irrecoverably.

You clearly aren't the "computer whiz" in your family, who has to field all the "can you help me? Microsoft isn't working. I think I've been hacked, do you think it's a virus? Why is my computer so slow? Why is fixing it taking so long?" issues.

As for your ubuntu issues - you booted off a liveCD. Good for you. Now, how many liveCDs have you seen of Windows? How widespread are they?

Anyway, the same problems apply in setting up the system regardless of Windows or *nix - you need a power user to get you through the humps. Good luck installing XP as a normal user when you run into the ever-present problem of no appropriate network drivers. Basically you're comparing apples and oranges.

> You clearly aren't the "computer whiz" in your family, who has to field all the "can you help me? Microsoft isn't working. I think I've been hacked, do you think it's a virus? Why is my computer so slow? Why is fixing it taking so long?" issues.

There's a limit of naivety beyond which everyone will need help, yes. I simply believe that you can get away with being more naive with Windows most of the time. I don't have to fix my mother's Windows problems, or my sister's - from time to time I have to fix my father's, but even then not just for setting the thing up.

> As for your ubuntu issues - you booted off a liveCD. Good for you. Now, how many liveCDs have you seen of Windows? How widespread are they?

I think I've seen two or three versions over the years, mostly as repair-tool environments. Why?

> Anyway, the same problems apply in setting up the system regardless of Windows or *nix - you need a power user to get you through the humps. Good luck installing XP as a normal user when you run into the ever-present problem of no appropriate network drivers. Basically you're comparing apples and oranges.

Windows XP network drivers - that's three versions out of date. And really, where are your discs? If you got it ready-made they should have sent you the drivers along with their bloatware, and if you didn't then they should have sent you the mobo drivers with the mobo.

To be fair, Windows 7 typically still has that network driver problem with new ThinkPads. Fortunately, ThinkPad is a good company, and they make it relatively easy to locate and download the right network driver to put on a thumb drive, if you don't have an install disk handy (happens more commonly than you suggest). This is easy if you know what you're doing, but, of course, next to impossible without some power user knowledge (it also requires access to another computer with a working internet connection and a thumb drive).

And also to be fair, XP is in a valid sense only one version out of date. Vista was a fiasco, and that's not just a meme -- Vista was released way ahead of schedule and still had a ton of known unresolved bugs. It was by far MS's most buggy release, and the whole thing was a disaster. Compared to the relative stability of XP, Vista was a step backwards. And while Windows 7 can rightly be called a newer version of Windows XP (and is better in nearly every way), it would be a stretch to say that Windows 8 is a newer version of Windows 7. Windows 8 is a fork into a different paradigm, and is so different it's not even comparable to Windows 7 along a number of dimensions.

All that said, XP is still pretty damn out of date, even with the latest service pack installed, and if Windows 7 is an option, it's usually a better one. Of course, XP uses less memory, so it may be suitable for particularly old computers. And for people who don't like to pirate software, owning a copy of XP is a great reason to stick with XP (if it works well enough, why fork out the money for Windows 7?).

"If Windows were equally messed up as Linux I'd expect to run into the problems at around the same rate."

My personal experience (as someone who has been dual-booting Windows and Ubuntu for nearly a decade and uses both operating systems regularly) is that both have a lot of annoying problems, but Windows has always been way more of a pain in the ass, and I've run into more problems using it. Especially when it comes to driver issues, Windows has problems (it took hours to get a standard WiFi card to work, and once my video driver got corrupted so badly the system wouldn't even let me try to fix it, and restore failed -- I had to reinstall the OS to fix it).

Will this be fixed in an upcoming ubuntu or skype update?

Both bugs appear to be things which a future ubuntu or chrome/skype update should fix. Please correct me if I am wrong.

Linux haters can't stand that Ubuntu can release new OS versions so often. OTOH, I'm tired of the chase and from now on will just stick with the LTS releases. If I were some years younger though, I'd be there in the thick of every release. Kudos to the makers and the doers of Ubuntu. RedHat, where were/are you?

I have noticed that Ubuntu haters are not Linux haters at all, they are previous Ubuntu users, or simply fans of other distributions.

P.S.: RedHat is selling server versions and support for millions of dollars. Ubuntu still is not profitable despite all they have done for the community. It seems unfair to me.

I dunno what happens if you install them after you upgrade, but google-chrome and chromium still work for me, as I installed them long before this upgrade. :)

With Chrome it is just a packaging problem - but since Chrome is not open source, nobody can fix it on their own.

I'm using Chromium from the Ubuntu repository, which is good enough for me until they get it sorted out.

Sounds like you could also use the dev channel Chrome.

I upgraded a couple of days ago. Skype works fine; I'm not sure which version I'm on or whether it was upgraded during the process.

Chromium works fine.

Regarding Google Chrome, it has been fixed on the unstable channel for a while now.

This is why I don't use Ubuntu anymore. Every single release they roll out has serious issues interrupting a normal user's workflow.

Ubuntu, if you're reading this: please consider testing your releases more thoroughly.

Maybe Canonical is releasing bugs on purpose to compete with Microsoft.

VMware tools will also not install.

Any known fixes for Spotify?

I don't recall needing any fixes for it, but it crashed randomly enough (and not just on Ubuntu) that I switched to their web client.

I played with the beta and I have to say that it is way more polished than 12.04/12.10. I have played with Linux for many years (since the early 1990s) but it was always too much of a hazel to use it as my main OS. But since 12.04 I fully switched from OSX to Ubuntu; the only thing I kind of missed was the polish of OSX. 13.04 makes lots of things nicer and more fluid. I actually like Ubuntu better than OSX. One thing left to do is to get some decent icons and get rid of the default orange :)

It's weird, I've been on Linux for almost 20 years and Ubuntu as my main computer for 10 and for me it's the opposite. I find OSX has finally almost caught up to Linux with Mountain Lion.

It's crazy how many OSX features come from the unix world but in a delayed, better looking fashion.

The last missing major piece for OSX was good multiple workspace management, which came in the form of mission control and the touchpad gestures that makes it so efficient. I don't know if its any good with a mouse though.

Efficient multiple workspaces is a critical productivity feature for me. It's like having multiple monitors without having to carry a desk and a bunch of monitors with me all the time.

The previous OSX implementation called 'spaces' took too many actions to switch workspaces unless you had your hands on the keyboard and had setup shortcuts. This made OSX unusable to me.

Unfortunately, Ubuntu also regressed by getting rid of the single-click workspace switcher (which was working beautifully since like 1995) and doing a multi-workspace UI that resembles Lion's 'spaces'. However, I was able to remedy this by setting my middle mouse button to bring up the workspace switcher which is really fast.

The other Ubuntu regression which annoys me to no end and again is the same in OSX is that clicking on a dock icon when you are in a workspace and there is an instance of the app opened in another workspace makes you automatically jump to the other one instead of opening a new instance in the current workspace.

This assumes I separate applications on a per workspace basis which is stupid. I already have a dock to get access to different applications. Workspaces are used to separate projects or workflows.

Each workspace needs its own browser window that holds the tabs related to the project being worked on and also its own terminals and text editors. I don't want to be yanked out of a context because I tried opening an editor and there wasn't one already in the workspace.

Hopefully this is eventually fixed in both OSX and Ubuntu.

> The other Ubuntu regression which annoys me to no end and again is the same in OSX is that clicking on a dock icon when you are in a workspace and there is an instance of the app opened in another workspace makes you automatically jump to the other one instead of opening a new instance in the current workspace.

If you middle click the icon it will open a new one instead of taking you to an existing one.

I didn't know that. Thanks for the tip. However, since I set my middle button to go to the workspace switcher there's a conflict.

I just tried it and it does open a new instance _and_ go to the switcher which is not great.

Also most of the time if there is already an app window in the workspace, I would want it to be made visible not a new instance to be opened.

You can also hold down shift and click on the icon to open up a new window.

Similarly if you use a keyboard shortcut to open your icons (like Win+1 to open the first icon) holding shift while pressing the shortcut will open a new window (Shift+Win+1 will open a new window for the first icon).

I've always just used ctrl-alt-arrow for ubuntu (and other nixes)- add a shift and the current window comes along for a ride. When did this break? (I'm still on 12.04 LTS)

If your hands are on the keyboard that's fine but I also need a good solution for when I am using the pointing device.

Do you regularly use the pointing device with both hands?

How would you press Ctrl-Alt-Left with less than both hands?

On a standard 104-key Windows-style keyboard (or the 101-key standard precursor, or anything with a generally similar layout), I'd use the right-side Ctrl, right-side Alt, and the Left arrow in the arrow key array (not the numeric pad). Should be pretty easy to one-hand even with fairly small hands.

For e.g. VT switching, the right-side modifier keys won't work.

it's possible on my laptop keyboard as the alt/ctrl are right next to the arrow keys, but admittedly it's not a particularly natural or comfortable way to do it

You can swipe to a new space.

Speaking of which, one thing that most people coming to ubuntu from windows and OSX often don't spot (which I think you have, but I post mostly for the benefit of other readers) - Ubuntu, much as canonical has done a lot of custom work on it, is still linux. You can customise it in pretty much any and every way that you want. You can even not use Unity at all if you don't like it -- no reason to throw out all the great foundations and switch to a different distro just because you don't like one of the default UI settings :P

(There are several valid reasons for switching from one linux distro to another, it just makes me sad to see so many "I don't like Unity so I'm moving to Gentoo" posts -_-)

It is still possible to customize Ubuntu, but you cannot customize Unity itself. It is no longer the goal of canonical. Canonical is trying to make an OS for an average person, who doesn't care about customization. Power users (ie normal GNU/Linux users) are not their target audience.

I went through a phase in which I wanted to be able to go to great lengths to customize the UI. Lately I've realized that in the end, virtually all of my "power user" use cases are served by the terminal and emacs. When it comes to the UI, I just want something with reasonable behavior, that allows me to launch apps, and generally stays out of the way. Nowadays when I set up a new linux machine it's mostly just a matter of doing the default install, using aptitude to get the same old utilities, languages, and apps that I always use, and copying over whatever I want to keep in ~/.bashrc and ~/.emacs

I think everyone goes through that phase. I used to spend hours customizing my desktop: finding the perfect image, icons, size of the task bar, setting up some crazy conky displays, etc... Now it's i3 with a plain black background.

I'm using i3 as well and am loving it.

Same. Terminal + Tmux + Vim + launcher (Gnome-Do or Synapse) + Vim keybindings for everything (browser, anything else) = couldn't care less what desktop I'm running (minus privacy concerns like embedded Amazon searches).

I fully agree. I just want a desktop that stays out of the way. I have Ubuntu and I have replaced Unity with Cinnamon. It is not very polished, but in a couple of minutes you can configure it to be usable.

The desktop that does the best job of staying out of your way is no desktop. I've been using Ubuntu for years without a desktop, just the dwm window manager. Now and then I ask people, because I know the desktops are evolving, what they might have to offer me, but I never get a coherent response.

I really like enlightenment, but the wifi connection manager in e17 baffles me. I've had it working, but it seems to want me to install something else- do I then take out network manager? etc etc.

Unity has a huge flaw, though: to find a program you have to know its name. That sounds silly to non-nixers, but for instance some are known as one thing but actually are named something quite different (e.g., 'Document Viewer' = 'Evince'). In cases where unity doesn't recognize both names this can be a problem.

I used to learn the command-line invocation by looking at the shortcut in the menu. No more in unity (at least I haven't learned how).
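One workaround, since the launcher entries are still plain .desktop files under the hood: the Exec= line is the command-line invocation. A self-contained sketch using a sample file (on a real system you'd grep the files in /usr/share/applications/ instead):

```shell
# Create a sample .desktop file to show the format; paths and contents here
# are illustrative, not taken from any particular Ubuntu release.
cat > /tmp/document-viewer.desktop <<'EOF'
[Desktop Entry]
Name=Document Viewer
Exec=evince %U
EOF

# The Exec= line maps the friendly menu name back to the command:
grep '^Exec=' /tmp/document-viewer.desktop
```

So `grep -l 'Name=Document Viewer' /usr/share/applications/*.desktop` followed by a look at that file's Exec= line recovers what the old menus used to show you.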

I would take a punt and suggest that the average punter couldn't describe what a desktop is. I'd struggle to define it. But think it includes: application launching, application management, window management, file management and a couple of helper applications: clipboard, network manager, notifications, workspaces and probably some input device management! It's no wonder why people give incoherent responses!

Same here. I usually use OpenBox then add a few commonly used programs to the menu. I often wish this was possible with Windows (might be with 3rd party software) but I understand that they have a different type of user in mind. There really isn't a need for a desktop though.

dwm is great, but I like twm slightly better due to its more conventional window management.

It's also extremely satisfying in a mischievous sort of way whenever I walk into some public place and pull out my T410 with that setup. I can't count the number of stares I get. One guy even asked me why I used a laptop from the 1990s.

Heh. You should see the looks I get with my Panasonic Toughbook CF-30 (running Arch Linux and i3wm). People must think I'm entering nuclear launch codes every time I use vim in public on the thing...

I also use the Thinkpads - great with Linux, and I love the way the keyboard feels. I haven't noticed the stares, though.

> no reason to throw out all the great foundations and switch to a different distro just because you don't like one of the default UI settings

Installing Cinnamon on Ubuntu was broken, and many things didn't integrate like they did with the default install, which is why I just switched to Mint in the first place rather than screwing around with UI settings.

I got fed up with the DEs and switched to #! after about 2 months.

I still think it's Openbox, Wmaker, or die.

If you don't like orange, take a look at Lubuntu 13.04. It has nice minimal icon set and is blue instead of orange.

One could first try installing Advanced Settings from the Ubuntu Software Center and playing around with the GTK+ theme and the icon theme.

The problem I've found is that there aren't any simple GUIs for creating/tweaking GTK3 themes. You could configure the colour scheme slightly (5 colours) with Gnome 2 and the older GTK engine. The other thing is that you still have a mix of GTK3, GTK2 and non-GTK apps, well at least I think that's true, OpenOffice being an example. It's an annoying problem for any distro, but the end result is that you end up with loads of inconsistency.

In short it's more of a pig for a user to configure their desktop style than it was in Windows 98.

So yes you could swap your theme, but I have yet to find many professional themes, and even the better ones like Bluebird, have issues.

Yeah it's fugly by half (Ubuntu that is). And I once even liked the browns! I prefer the subdued themes. I'm almost happy with Bluebird and Xfce, but I still have theme/colour scheme issues. I'm forever hoping for a pretty and consistent interface throughout Ubuntu.

If you want nice icons I can only recommend Faenza from tiheum; he also has a PPA.

http://tiheum.deviantart.com/art/Faenza-Icons-173323228 https://launchpad.net/~tiheum/+archive/equinox

"too much of a hazel"? What does this mean?

I suspect he meant "too much of a hassle." I like his version too.

In case anyone else was wondering, the difference between the Mac and non-Mac releases is:


I've been using it for a month and haven't regretted a single second of it. By the way, Nexus 7 and other MTP device users: plug your device in and enjoy full read/write support by default :)

Wow, that's a great new feature.

This is one thing that really infuriates me about new Android devices: their lack of syncing support with Linux boxes.

Does anyone know if this support is something that can be backported to 12.04?

I had some success following this[1], though it didn’t seem to handle transferring lots of files at once very well.

[1]: http://www.webupd8.org/2013/01/upgrade-to-gvfs-with-mtp-supp...

Thanks, that was an incredibly helpful link.

Adding support was pretty trivial in the past: install a package and edit a config file. Just google your device + ubuntu and you'll more than likely find instructions.
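For the record, the "config file" in question was typically a udev rule, something along these lines (the vendor/product IDs below are illustrative placeholders, not values for any particular device; get yours from the output of lsusb):

```
# /etc/udev/rules.d/51-android.rules
# Replace xxxx/yyyy with the idVendor/idProduct that lsusb reports.
SUBSYSTEM=="usb", ATTR{idVendor}=="xxxx", ATTR{idProduct}=="yyyy", MODE="0666"
```

After saving the rule and replugging the device, the MTP tools could open it without root.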

> Nexus 7 and other MTP device users: plug your device in and enjoy full read write support by default :)

Is browsing/copying laggy or slow at all, like on Windows 7?

What is this MTP crap? With my Nexus One I could plug into any Windows or Linux machine and transfer files. With my new Nexus 4, uhm nope. Not without hassle. It sounds like MTP doesn't work all that well and you have to upgrade to the new Ubuntu to get it. I would bet that the next Google series of phones won't allow any such mounting. Maybe not even any local storage on the phones. Everything on their servers, their "cloud". It's starting to piss me off.

The Nexus One had removable storage. You connected it, Android unmounted the SD card, and the desktop mounted it - boom no worries.

The Nexus 4 cannot unmount its internal storage, so something like MTP is required.

That makes sense. Still, I think the removable SD card was the way to go: I had more storage at 32 GB with my own SD card before, and now with the Nexus 4 I can't mount it and it's limited to a third of the capacity.


Now what about release notes? :-) (edit: some information is here https://wiki.ubuntu.com/RaringRingtail/TechnicalOverview )

Currently i am thinking of switching to LMDE (Linux Mint Debian Edition) because i don't really like the way Ubuntu is going with Unity, Mir, upstart, doing too much stuff on its own.

Anyone with experience from LMDE here? Is it stable? Pros? Cons?

I've switched all my desktop/laptop installs to Xubuntu. All you need to do is "sudo apt-get install xubuntu-desktop" and you get a nice, fast, clean desktop. It's also easy to switch back to Ubuntu and, as it's just a remix of Ubuntu, 99% of all the packages just work, support documentation is still valid and practically shares the same user base.

Xubuntu is still led by Ubuntu though, and it currently uses upstart. I'm not sure if Xubuntu will use Mir in the future.

I second Xubuntu. Still the distro I use on my linux machine to this day. I actually tried Linux Mint Debian Edition. I think the two main things that turned me off were its inferior support for particular third-party programs (I think they were Skype or Wine), and its general feel (font and window management seemed rough). This was about a year ago I think, so YMMV

Xubuntu here as well, it runs great without causing you any hassle, like Ubuntu with Unity does. The UI doesn't change a lot and you know exactly what you're getting.

The parent comment made me check out the official site.

The top feature on this page is a deal-breaker for me: http://xfce.org/about/tour

A Linux distro with no local man pages? Seriously? I sometimes use my laptop in places that don't have wifi. Not to mention the fact that the online manpages may become out of sync with upstream, or that the online manpage may describe a different version than what you have installed locally, e.g. if you're using an LTS version that's years old...

I tried stock Mint for some time, but found it buggy. I was a longtime Ubuntu user before that, having switched for reasons similar to yours.

Since then I've moved on to CrunchBang [0], which I absolutely love. For me, it hits the sweet spot of having everything I need without all the extra pizzazz (cruft?) of something like Ubuntu or Mint.

[0] http://crunchbang.org

It was buggy when Cinnamon was new, but now it's really quite stable and proves to be no more or less buggy than stock Ubuntu or any other distro. There were bizarre memory leaks in Cinnamon but those were fixed a long time ago.

There is an ongoing issue with Mint Update, but I'm aware a patch has been submitted. In the meantime I just run `sudo apt-get update && sudo apt-get dist-upgrade -y && exit` every day. (For whatever reason kernel updates get held back if apt-get upgrade is run instead of apt-get dist-upgrade.)
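
For reference, that daily routine can be put in a small script. The commands and flags are standard apt; the note about dist-upgrade is my understanding of why it pulls held-back kernels through (dist-upgrade may install new dependencies, plain upgrade may not):

```shell
#!/bin/sh
# Refresh the package lists, then upgrade everything.
# dist-upgrade (unlike plain upgrade) is allowed to install new
# dependency packages, which is how new kernel images come through.
set -e
sudo apt-get update
sudo apt-get dist-upgrade -y
```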

Mint is awesome and one of the better distros out there. Linux Mint 15 looks like it's going to be a decent improvement too.

Why switch to Linux Mint when you can replace Unity with Cinnamon while remaining on Ubuntu? Apart from the desktop, what are the compelling advantages?

Doesn't Ubuntu insert Amazon affiliate links into your search results while Mint doesn't, or was that removed? Or does Mint do it as well?

That is and has always been a unity feature (or 'feature') - unity-lens-shopping. It's not even installed if you don't install unity, e.g. if you use xubuntu, kubuntu, lubuntu, etc.
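
For anyone who wants to verify this on their own install — the package name is as shipped in 12.10/13.04, and the dpkg/apt invocations are standard:

```shell
# See whether the shopping lens is installed at all
dpkg -l unity-lens-shopping

# Remove it if it is; this only affects Dash results,
# the rest of Unity keeps working
sudo apt-get remove unity-lens-shopping
```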

Sure, but it's enough to make me avoid Ubuntu altogether on principle.

Mint doesn't do that. However, the affiliate advertising package is easy to remove from stock Ubuntu.

Last time I tried, Cinnamon proved to be exceptionally buggy when running on stock Ubuntu. Also, Mint's default aesthetic is better IMO!

I've liked ubuntu and actually still use it.

I started using #! some time ago. I agree with the lightweight aspect of it very much. One session on openSUSE with KDE reminds me of how lightweight CrunchBang is :-D

For people already on non-ubuntu debian systems, I would recommend taking a spin on it.

I switch to Korora for my laptop. It is basically the Mint of Fedora. Quite happy so far.

My feeling is that Fedora has the better engineering, but Ubuntu has better polish. Korora puts some icing on top: good font rendering, Adobe Reader, Skype, nvidia drivers.

Fedora stays closer to upstream than Ubuntu. They just package Gnome 3 for example.


My experience with Mint less than a year ago was that the Cinnamon interface was so bug-ridden I actually looked forward to going back to Unity. The only real bug I've encountered in Unity is the integration between it and LibreOffice that renders menu keyboard shortcuts inoperative (Alt+F doesn't open the File menu, and so on). It's a known problem, but the Ubuntu, Debian, Unity, Gnome, LibreOffice and OpenOffice people all keep passing the buck and nobody considers it a high enough priority to get in there and fix it.

If Unity had a functional Alt+Tab interface (swapping between open windows instead of open applications) I'd have no further complaints at all. I've grown used to it. Can't wait to see how 13.04 improves things.

Alt+Tab is done the same way in Gnome 3, that is, if you don't know about Alt+`, you're going to have a bad time.

Honestly I try to use E17 whenever possible (there is a bug with VMware Player that makes it impossible to use the two together for long; virt-manager and kvm-spice are only so helpful in comparison, so for me it's mostly E17 plus some time in another system with Windows loaded in a VM).

In E17 the Alt+Tab behavior works as described (within the current desktop only, to keep large numbers of windows manageable; if you want Alt+Tab to switch between all open windows, that's configurable too, as there are a lot of options).

I know it's fashionable to use the desktop environment that comes with your distribution/pick a distro that has the DE you want, but I like to push E17 at every opportunity, it can usually be installed in any distro without too much trouble, and especially now that you can't claim it will "never be released" there should be stable packages in your modern repos :)

I don't care too much about the next Ubuntu release personally, I am waiting for eLive Gem ~3.0 (whatever version is next) on Debian Wheezy which should be out soon!

I used Cinnamon for a while, and occasionally the desktop would freeze and I'd have to restart X.

Cinnamon has gotten much better. You might want to give it another shot.

I was on LMDE a year or two ago. I still found it a little 'one-step-removed'. Cinnamon is in the Debian repos now - just use Debian testing or unstable.

rmadison claims that it is only in sid, which is an important distinction, as current testing will soon be released as Wheezy.

A good point, though for most techies, running sid/Unstable should be alright on their desktop.

But so should testing – it does offer some stability and the remote guarantee that packages are installable, plus you get upgrades to stable for free at each release.

Then again, YMMV and everybody is probably better off making their own decision here :)

I'm using openSuse. It's fast and stable. Ubuntu Gnome ( http://ubuntugnome.org/ ) is also a good choice, but its not pure gnome as always.

openSUSE: Did you have problems with speed? Or Nepomuk? I was eager to have a go when it came out (12.3), and it has been a very raggedy experience for me so far. Nepomuk constantly crashes, the system is veeery slow (on KDE at least), etc.

I'm on 12.3 with KDE, the only problem so far was making it remember the dual-monitor setup I was using. Apart from that, all good. A nice improvement over Ubuntu 12.04 with which I had a lot more problems.

I've switched from Ubuntu to Bodhi and then to Gentoo. Much more enjoyable experience for my power-user needs.

Used it some time ago. It was good. Sane. Beautiful, with quite sensible decisions about what users want/need, contrary to Ubuntu's "what we want users to want/need". Stable. Loved the way it worked just out of the box.

And the best thing was it didn't depend on Ubuntu. I say this because I liked even the Mint desktop better than the Ubuntu it was based upon.

Since then I've switched to some other OS.

Just a word of warning to anyone who may be thinking of installing this release on a MacBook Pro w/retina. I don't believe there is a window manager available that will play nicely with high DPI. I do hope that the linux community is planning on better supporting very high resolution displays.

13.04 works absolutely wonderfully, on my 2012 MacBook Air. Everything works perfectly for me on that machine.

How high resolution? My laptop looks great and works perfectly with Ubuntu @ 1920 x 1080. What doesn't work?

What he's referencing is displays that are meant to display the UI at twice the pixel density. It will run just fine on the macbook pro with the retina display, but it will run at a 1:1 density instead of a 2:1 density (which is what it's designed for). Here's a little image I drew quickly to illustrate: http://i.imgur.com/4gizfp0.png

On that screen size, OSX will appear like the top image on a non retina display, and like the middle image on a retina display. Ubuntu will display like the top image on a non retina display, but will appear like the bottom image on the retina display.

As others have mentioned, the 2880x1800 resolution is unusable with a 1x1 pixel display. You can change this with "xrandr --output DP-2 --scale .5x.5" and that will cut the resolution in half, but everything is very fuzzy.
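
Since the output name (DP-2 above) varies by machine, it's worth querying xrandr first. A sketch of the procedure, with the caveat already mentioned that the scaled result is interpolated and fuzzy:

```shell
# List connected outputs so you know the real name of the panel
xrandr --query | grep " connected"

# Render at half the native resolution; substitute the output name
# the query above reported (eDP-1, LVDS-1, DP-2, ...)
xrandr --output DP-2 --scale 0.5x0.5
```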

http://en.wikipedia.org/wiki/Retina_Display — we're talking in the order of 2880 x 1800 for a 15" screen and 2560 x 1600 for a 13" screen. Linux window managers react by making everything tiny, which is not ideal.

See http://www.phoronix.com/scan.php?page=article&item=apple...

2880x1800. On a 15.4" screen, mind you...

As if that wasn't enough, the 13.3" version is 2560x1600. Which is completely insane.

Running Ubuntu on a regular Dell laptop that is sometimes docked in a dual-monitor setup has been quite frustrating after versions 10 or 11 or so. I've had issues, repeatedly, with either setting up Unity for dual screen, or getting X to work at all. Going back to plain Gnome served as a partial workaround, until I switched to the new laptop - and then I started having issues initializing the screens after logging in (one of the screens would randomly remain blank when going into 1280x1024, I have to repeatedly logout/login to make it work).

Eventually, I gave Kubuntu 13.04 beta a try and, surprise, it works perfectly. No Unity nonsense, X works great.

It's a bit strange switching to KDE now. I've actually used KDE 1.0 back in the day, and Gnome 1.0. I've been a Gnome user most of this time. But now I'm starting to find even plain Gnome quite frustrating.

The same problem here. If I start it docked with the lid closed, it doesn't get past the Ubuntu splash screen and hangs in some weird state where the only possibility is hard power-off. I have to start it with the lid open, then close it after logging into desktop and then manually setup the second external monitor, because although it detects it, it keeps it "disabled". This entirely sucks in 12.10 I hope 13.04 will be better, but I'm little reluctant to upgrade so early after upgrade to 12.10 broke my system.

It does feel slightly odd to (finally) have arm listed as an available architecture for the images. And I say that as someone who has run Ubuntu on my "Android tablet".

I guess I'm just not yet used to "proper Linux" being something you can download for embedded-type architectures and devices without heading to XDA first.

Still. Definitely Good to see Ubuntu expanding support to the sort of devices which are actually new and interesting.

That's been a common sight for Debian users for ages :)

http://www.debian.org/ports/ http://cdimage.debian.org/debian-cd/6.0.7/

Ubuntu's had ARM images for going on like 4 years now!

Poor GPU support (lack of drivers) is still a major deal-breaker for home and media use. As of now, hardware should still be selected on the basis of available driver support; it's a major obstacle, and the situation doesn't seem to have improved significantly since 3-5 years ago.

I suspect it can't improve much more without a paradigm shift. That could be OEMs shipping Linux pre-installed along with working hardware and the necessary drivers. Or it could be having a better abstraction so that Nvidia & ATI can keep parts of their drivers closed without the regular version mismatches with the open source world. I think that's part of Canonical's plans for Mir.

The only problem I have is with the Nvidia drivers, especially Optimus/separate video card on my laptop which doesn't work. :(

But the stock Intel driver works well enough; I just don't want to figure out what sort of CLI cargo-cult fixing it will take to get the Nvidia card going.

Have you tried using bumblebee[1]? I use it on my Samsung QX411 which has an NVidia GeForce GT 525M video card along with the onboard Intel. The Intel card is used by default at all times. If you want to run a program using the NVidia card, just prepend 'optirun' to the command.

[1] https://wiki.ubuntu.com/Bumblebee
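
To illustrate the optirun usage: glxinfo is just a convenient way to confirm which GPU a command ends up on (assuming the mesa-utils package is installed; any program can be prefixed the same way):

```shell
# Runs on the Intel GPU by default
glxinfo | grep "OpenGL renderer"

# Same query routed through the NVidia card via bumblebee
optirun glxinfo | grep "OpenGL renderer"

# Prefix any program to run it on the discrete card
optirun glxgears
```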

I tried bumblebee on 12.10, after unsuccessfully trying to install Nvidia drivers. It was... not a good experience. I ended up reinstalling 12.10 just to get Unity and a reasonable resolution back. I have no doubt that there was a fix (possibly involving xorg.conf editing) but I had already spent so much time on it and I needed to get work done. Hopefully the new release is better here, but I don't hold much hope.

I'm not blaming bumblebee specifically here, it's just a pervasive side effect of Linux and the modular approach. I can really appreciate the integration that goes in to a Mac.

I'm sure experiences vary greatly depending on the mix of hardware. I remember having to tweak some things to get bumblebee to work properly (or maybe I had to recompile from the source) but it's worked flawlessly since. This is on Ubuntu 12.04.

pkolaczk does bring up a good issue about the DisplayPort not working properly though. I don't use an external display on my laptop and have never played with that.

On another note, I remember installing 12.10 on my desktop right around the time it came out. My desktop has a Radeon HD3870 video card which was completely incompatible with Unity. I don't remember the issue exactly but my choice was either to use the open source drivers or have the desktop environment fail to show at all with ATI's drivers. I just did a search and I guess the fix is to downgrade X-Server and install a legacy driver. It's a shame something as central to the user experience as GUI performance still doesn't work out of the box or worse yet, critically breaks on an upgrade.

For me bumblebee works, but then the DisplayPort is useless. It would be awesome if bumblebee could automatically detect that the laptop is docked and turned on the external graphics to enable proper functioning of DisplayPort. Or even not automatically, but simpler than uninstalling/installing bumblebee (and fixing the mess it did to config files everytime).

All the hardware I purchased in the past several years works with most GNU/Linux distros. There are some minor problems sometimes, but usually they do not force me to purchase hardware specifically for it.

My new GTX card works great; Steam games (that have been ported to Linux) and Minecraft (awesome!) all run great.

The minimal CD can be found here:


I always use that instead of the official desktop CDs, to avoid all the extra programs that I never need.

I was looking around youtube for unity videos and found Richard Stallman talk about Ubuntu: http://www.youtube.com/watch?v=CP8CNp-vksc

Why do they still recommend 32 bit for everybody?

I have been using 64 whenever possible. I wonder if any weird behavior can be attributed to it?

Last I looked, they had ~30% of users running on 64-bit-incapable hardware, and they don't want to ask technical hardware questions on the download page. For example, Intel Atom CPUs only got 64-bit relatively recently; the earlier netbook generations were 32-bit only.

Edit: found the source. It was actually 25%, a year ago. And the other reason was that the error message when trying 64-bit on incapable hardware was very cryptic. https://lists.ubuntu.com/archives/ubuntu-devel/2012-April/03...

A few drivers and Flash used to have issues with the 64-bit versions. Also 32-bit works everywhere, whereas 64-bit doesn't. Ubuntu tries to be n00b-friendly and recommending the 32-bit version stacks the odds in Canonical's favor that inexperienced users will have a better first impression during install and first use.

We've now crossed a threshold as there are millions of Windows 8 computers with UEFI by default which won't boot 32-bit operating systems by default.

Been using 64-bit for the last 5-6 years I think, never experienced any major issues with it. The only weird issues I've had were browser plugins (Java and Flash), which were a bit dodgy on 64-bit, but that was years ago.

  esennesh@lap282:~$ sudo do-release-upgrade
  Checking for a new Ubuntu release
  No new release found

This now is working for me.

Their timing, especially for us in Europe, is really bad.

If they announce an update it should be available for us before we start doing actual work.

They didn't announce the new version. The post is premature.

Now they did.

I have been using it for about an hour on my x64 desktop. I was using Ubuntu 12.04 beforehand. The 12.04 is still on one of my disk partitions.

Canonical is apparently going further in the direction of pre-installed crapware. There was an Amazon icon on my Launcher, which I removed. People have also been warning that the Ubuntu Dash sends your searches to Amazon, so I went to System Settings -> Privacy and turned that off.

Haven't noticed that much. It may be a little faster, although my Ubuntu desktop usually runs faster when I reboot it, so I can't tell if it will stay this way yet. Doing horizontal resizing of windows seems a little more difficult, some setting must have changed. Firefox's plugin service seems hosed right now, maybe due to all the new Ubuntu installs. The Firefox edit window is acting a little odd as well. It has crapped out a few times in terms of display.

There were no workspaces until I turned them on in settings, they seem down on workspaces - something I had without a problem since fvwm back in the mid-1990s. Hopefully there will still be workspaces in the next Ubuntu version.

It seems like they've added a little more customization ability for the launcher, sizing the launcher, hiding the launcher etc. Or at least made it more obvious. It doesn't work that well but at least it's there.

I noticed they changed other things...grub now just says Ubuntu on top, and has simplified options under that. I guess I'll bump into more things as I go on.

>> Canonical is apparently going more in the way of pre-installed crapware. There was an Amazon icon on my Launcher, which I removed. People have also been warning the Ubuntu Dash sends your searches to Amazon, I went to System Settings -> Privacy and removed that.

Thanks for the tips, when we move to 14.04 LTS we will need to know them.

>> There were no workspaces until I turned them on in settings, they seem down on workspaces - something I had without a problem since fvwm back in the mid-1990s. Hopefully there will still be workspaces in the next Ubuntu version.

Do you mean the cool workspace switcher on the Unity Launcher panel is going to be gone? That's odd.

The workspace switcher is still available, and I don't think there are any plans to remove it. It's just disabled by default. I guess it's a mixture of saving space in the launcher and avoiding the potential confusion when new users hit the wrong button and suddenly see their windows vanish.
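Besides the checkbox in System Settings, workspaces can be re-enabled from a terminal by setting the Compiz workspace grid size. The schema path below is what 13.04's Unity profile appears to use, so treat this as a sketch rather than gospel:

```shell
# A 2x2 grid of workspaces (1x1 effectively turns workspaces off)
gsettings set org.compiz.core:/org/compiz/profiles/unity/plugins/core/ hsize 2
gsettings set org.compiz.core:/org/compiz/profiles/unity/plugins/core/ vsize 2
```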

Doesn't seem to be any point for me [1], I don't use Python, or Libreoffice. And I use gnome-session-fallback, can't stand Unity, and just wish they'd give me an OOBE setting for having an 'old-school' Gnome 3 session!

Although Linux 3.8.8 might well make me upgrade.

[1] https://wiki.ubuntu.com/RaringRingtail/TechnicalOverview

Don't update, then, and use whatever distribution suits you better. It's one of the greatest things about Linux: choice.

No change/upgrade will ever fit every user. I use Python and Libreoffice (not that these are really dependent on the new release) and I like Unity; can't wait to try it out.

You do not have to change distributions because you do not like the desktop, you can simply install another one.

That's not entirely true with Ubuntu. With most other distros, yeah, you don't need to change everything else to change a DE, but Unity integrates with everything, which means Gnome and Cinnamon don't work well with ye-standarde Ubuntu.

It's been a while since I configured my system, but I am currently running Ubuntu 12.10 (upgraded incrementally from 10.04) with an alternate desktop (Awesome). I have never* noticed a program not work because of this, including the applet icons. The main point to consider is that most of the programs Ubuntu runs are not written, nor maintained, for Ubuntu; therefore, they should work perfectly well with other desktops. The only incompatibility Unity introduces is when you are actually using Unity. When you are using another desktop, a program that works in another distro should work just as well. If you install, for example, the package "mate-session-manager", you will get a traditional desktop while still running Ubuntu. There are many other desktops to choose from in the Ubuntu repositories. Admittedly, installing one not in the repos will be more complicated.

* I have actually run into one problem (with Java) arising from the fact that Awesome is a non-reparenting WM. This rarely came up, and was solved by claiming to be running 'LG3D' as my window manager, as that is on Java's hard-coded list of non-reparenting WMs.
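
For anyone hitting the same thing: the LG3D trick can be applied with suckless's `wmname` utility (packaged as suckless-tools in the Debian/Ubuntu repos, if I recall correctly):

```shell
# Install the utility that rewrites the advertised WM name
sudo apt-get install suckless-tools

# Report 'LG3D' as the window manager; Java then applies its
# non-reparenting workarounds. Run this before launching the
# Java app, e.g. from your WM's autostart file.
wmname LG3D
```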

To be fair, that's a Java issue and not an Ubuntu issue. It came up for me with xmonad on another distribution entirely.

The only distribution of Linux I've used is 12.10 (and now 13.04). I use the Cinnamon desktop, and it seems great, I've had no problems with it. So now I'm wondering, could you explain a little bit what I'm missing out on?

Every time I've installed Cinnamon on Ubuntu (the main one, not a different flavor) there have been integration problems because Ubuntu has a bunch of patches on regular GTK stuff and hooks into it.

There is a GNOME flavor of Ubuntu, or you can simply run `sudo apt-get install gnome-shell`.

They supposedly address the privacy bug:


Well, I wish I saw the other review (or downloaded it) before getting my hopes up. It looks like they didn't fix f*ck all.


Note to self: continue to expect the worst in people.

(Edit: no malice understood, just brooding...)

There's a simple off switch for all online results in the dash, and it works. The dash is being improved so you can disable individual result sources, but that work wasn't ready in time for this release. I don't think there's any particular malice in the delay.

Just recently switched to Mint (Cinnamon) and found it to be quite good actually. Using the search via the start menu actually works! Can't say that this was always the case with Unity.

Are there any security improvements worth mentioning in this release? Ubuntu used to be at the forefront of security on Linux and kept track of improvements made per release on their wiki. This effort seems to have stalled after the departure of key personnel and nothing has changed in a year or more.


The Security/Features page remains up to date (mostly, see below) and while new security features did not land in 13.04, a lot of work was done on client application isolation and better supporting using AppArmor with LXC. Many of these features will land in 13.10 and 14.04 LTS.

The list of applications compiled with PIE is not currently up to date, but is being worked on (part of the problem is a lot of developers started compiling their packages with PIE and we now have to find them. A nice problem to have :). Also note that incremental security features are not usually mentioned. For example, seccomp2 is used in a number of places now and AppArmor improvements such as mediation of mount or the upcoming DBus work are/will not be specifically mentioned.

Is it now possible to perform basic tasks like specifying font and icon sizes without installing third-party tweak tools or manually editing text files?

I know a lot of work goes into these releases but when basic desktop functionality has been missing/broken since the 11.x series and fails to be addressed with each new release it starts to become depressing.

Ubuntu is a lot more tolerable when you just decide to live with the default fonts, window decorations, and icons. Just like you have to on the Mac.

I could tweak a Linux box all day long. On a Mac I just turn it on and go to work. Now the hard part is deciding between GNOME3, Unity, LXDE...

I guess I find the icon and font sizes of the default Ubuntu desktop (for lack of a better term) Fisher-Price sized, and I just can't comfortably work with it.

While I can deal with installing additional packages in order to gain control over such settings it just strikes me as a little ironic that such things are not easily configurable in one of the major desktop focused Linux distributions.

This version isn't so bad, but only by comparison with the last two. I usually download and run each version for a few weeks for fun; I ran some beta builds of this one and didn't have any major problems.

The Unity interface still sucks but they're making some subtle changes that make it better. I still don't like it.

What worries me most about Ubuntu's rising popularity is the fact that they're trying so hard to nail the tablet market. I would like to see a good Linux distro reach the popularity Ubuntu has, but offer more flexibility and focus on the desktop. Linux Mint is making strides in this direction, and looks to be a very nice candidate for it.

My prognosis is much the same as it has been for a while now:

For goofing off / watching movies/ checking email - Ubuntu 13.04 will be great.

For productivity I'm still using Mint, Gentoo or Ubuntu 10.04

For a server it's Gentoo all the way.

What happens after the Z release (coming up in 5 years)?

Perhaps they start from A again. If they really want to challenge themselves they should start with Aa.

Aahing Aardvark

Oohing Oompaloompahs

An 'A' release, probably. I suspect they'll be like hurricane naming in that regard.


I've been using 13.04 on a new MacBook Pro for 2 months now. Everything is great except that I cannot seem to get sound working no matter what I try. It looks like it recognizes the hdmi sound output, but it can't seem to see the normal sound output. I am a sad man.

Thank you very much! I don't think that this existed when I installed it, but I will try to get it working at some point when I don't have client work.


Thanks again!

They still haven't updated the main page, nor the download page - http://www.ubuntu.com/download/desktop still suggests to get Ubuntu 12.10.

Mmmm Debian based distros rock. That's why I run Debian

Can't wait to get back from work to update from 12.10!

Does anyone know if this will run on the Raspberry Pi (at least the Ubuntu Mini version)?


Looks like it never will, unless there's a Raspberry Pi with an ARMv7 (or up) arch.

Is there any particular reason you want it though? I use Raspbian on mine, and it works just fine. I couldn't be arsed with a GUI though, since the RasPi is just too underpowered to run a GUI alongside all the other crap I have running on it.

Not really, just because of familiarity with Ubuntu Server. Raspbian has been pretty great so far, no complaints here.

The Raspberry Pi has an ARMv6 chip I think, and the minimum for Ubuntu is ARMv7. It could be done, but the OS would require considerable modification.

No it won't because the Raspberry Pi uses an older ARMv6 CPU type that isn't supported by Ubuntu.

Any particular reason or use case for this?

A friend received his Pi yesterday but hasn't set it up yet. I can ask him to try it :-)
