* Skype won't start (without a fix)
* Google Chrome won't install (without a fix)
I don't think everyone is on the same page about Ubuntu being a consumer distro and not a hackers-only distro... I have definitely noticed a decrease in the stability of Ubuntu releases over the years, and attitudes like this seem to be partly responsible for it.
Oh I kept reading, this is better:
"Rainer, a better solution for international calls is to stop using Skype (which is closed source, and frequently has problems with its Linux client) and change to a service that supports standard protocols (SIP etc.), e.g. using Ekiga"
Yes, the client should change their software and infrastructure to better suit your product...
This makes it pretty much impossible for me to "sell" Ubuntu to any of my friends who aren't programmers AND tinkerers (many of my programmer friends are too busy being productive to be interested in dicking around with making things work anymore, and increasingly I hold this view).
This. I switched to OS X from Ubuntu a few months ago for exactly this reason. I was too tired of tinkering and configuring endless .conf files, adjusting my GNOME/Xfce panel layouts for optimum productivity, and changing my display manager because all of them sucked.
In my final days with Linux I was running Ubuntu Minimal with twm -- I went completely old-school. Fewer things at the core, fewer things to go wrong.
I caved and bought a MacBook Air. While I miss the endless configurability of Linux, I find that most of the time I spend trying to configure OS X is only spent to make things more suitable for my workflow -- changing the number of desktops I have, changing the applications in my dock, or adjusting Exposé to show all app windows.
I never have to configure something because it breaks or isn't compatible with something else. I could have gone the FreeBSD way, but then, well, you know. OS X just works. As a Ruby and Java developer I find it perfect. I probably will try the 13.04 release on the Mac (I tried the beta and liked it), and I'll see how it goes.
It looks like a solid release, so I'd really like to keep it (but I probably won't because I'm cheap and only have the 120gig SSD).
But I don't spend days tweaking the UI, I have a baseline of UI expectations that's not too difficult to meet. (But Windows 8 doesn't meet it... 'nother story for 'nother day...)
Something is wrong here. The massive shift of devs from Linux to OS X should warn some people somewhere about the overall quality of their stuff, a quality we have been proud of for decades.
As a Mac user who runs triple boot, I find the "just works" mentality in OS X present only as long as you don't step out of the typical use-case scenarios. It's great to, say, be a writer on OS X; there are very good tools and apps to write in a minimal, aesthetic environment (WriteRoom and others). On the other hand, as a "hacker", I've always had trouble in OS X -- from Page Up/Page Down not registering properly in Terminal, to full NTFS and EXT4 support (FUSE used to work, but I think it was not maintained), to getting vim, gcc and other tools of the trade working on OS X (yes, there are ports-like systems on OS X, but hardly ones that I'd say "just work").
On the other hand, in pure Linux world, brightness controls, backlighting controls, volume controls and multiple sound card support were all rather tricky when I tried using Gentoo on the same Mac for a time. Some of those I solved, but some I just gave up and was content that it kinda-but-not-completely works.
Ubuntu seems to be tested enough on Macs, so those things work on my laptop now without having to configure too much. It may not be the perfect "just works" experience, and I have seen a lot of things going the wrong way on Ubuntu, but currently it's a good mix of being easy to maintain and being easy to start doing actual work with.
There's a huge QA process for Ubuntu now -- packages have to go through a large swath of automated tests, and the alpha/beta versions are kept working on a daily basis. It was only 2 years ago or so that most Ubuntu developers didn't even expect the unstable release to be usable until the feature freeze milestone had well passed.
This is a sign of maturity. Time is our one true finite resource, and we need to prioritize what we do with it.
So his recommendation is proven to work and yours is....?
In the end, if his purpose was to install Ubuntu, then good for him. However, I wouldn't give it as general advice, if only because it voids the warranty.
So... yes, the user should use software that isn't deprecating support for their platform.
Edit: also, they're still sitting on version 5, which introduced group video calling. It's a pretty sad situation overall.
I don't think this is the problem with recent Ubuntu releases.
That's an interesting attitude.
Backwards compatibility is generally regarded as a feature. If you read Raymond Chen's blog about the lengths Microsoft went to in order to support old software on new versions of Windows, you'll see how important some operating system vendors consider it.
Linux generally seems to have an interesting attitude to backwards compatibility. "It works on old hardware" is something that used to be regarded as a huge benefit of Linux.
These days - especially for Ubuntu - it doesn't seem to be regarded as a feature at all. Additionally Ubuntu seems to introduce breaking API changes in every release. I find that to be quite a surprising route to take for an OS that is trying to build desktop share.
I'd be quite interested to hear why people think they do this. Is backwards compatibility just not seen as important at all or is there a bigger strategy here?
They are not obligated, but it is in their best interests to build a reputation of not breaking things.
I do think that Linux lacks a reputation for not breaking existing programs, much as it lacked a reputation for ease of use before Ubuntu.
Now, ease of use, and not breaking things? I bet that kind of reputation is worth millions of dollars.
Even Linus agrees about it, when it comes to the kernel breaking userland programs. The kernel developers have no control over userland programs. Nonetheless, the kernel developers try very hard not to break userland programs.
Ubuntu bug #1, for reference https://bugs.launchpad.net/ubuntu/+bug/1
The person you quoted isn't even a developer as far as I can tell.
It's not web-based like imo.im, but on the other hand you don't have to give your access credentials to a third party.
Oh, there is no download for linux... I'm not much for browser chatting.
I disagree, and I'm glad that most people disagree as well. User-friendliness is important regardless of what system it is, even Linux. Have you noticed that the most successful distributions of Linux are the ones that are most friendly to beginners?
This elitism is destructive and serves only to make the "experts" smug and keep everyone else in the dark. I've found that the real experts are the ones who are writing guides for solving issues so that beginners can solve them too. That way, everyone benefits... and the beginners get to learn too. Almost all of my experience with config files has come from having a problem, Googling it, copy-pasting some commands into the terminal, and then becoming interested and messing around with it.
Ubuntu has two commands for this, useradd and adduser, where adduser is the easier one for newbies, while useradd is more flexible but has a lot of ways you can get it wrong. In fact, 'man useradd' recommends using adduser instead.
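For illustration, here's roughly what the difference looks like in practice (the username and options here are hypothetical, just to show the shape of each command):

    # adduser: interactive; sets up the home directory, shell, password, etc.
    sudo adduser alice

    # useradd: non-interactive; you have to remember the options yourself,
    # e.g. -m to create the home directory and -s to pick a login shell.
    sudo useradd -m -s /bin/bash alice
    sudo passwd alice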
Unfortunately, that how-to page starts with useradd.
It isn't until later on the page that it mentions that there are some options you must add to the command if you want it to set up everything correctly for a typical user. And if you leave these options out, the page doesn't explain how to fix it after the fact.
Finally, near the bottom of the page it mentions the adduser command that would have worked with no special options and no hassles.
So, being at the time a Linux newbie, I worked through the how-to and tried the useradd command it suggested, got myself into a situation where I wasn't sure exactly what I'd really done and hadn't done, then finally discovered the adduser command I should have used in the first place. I posted a reasonably polite complaint that the how-to had led me down the wrong path and it ought to start with the recommended adduser command instead of just mentioning it in passing at the end.
The responses I got were fairly eye-opening. (Click the "show archived reader comments (36)" link to see them.)
This one was my favorite:
> Michael Greary [sic] you are a D&%K H#$&D. You’re the 1 that messed it up no1 else, and anyone else for that matter.
> Hopefully you’ve learnt by now that when blindly running linux commands you should read the whole article to make sure it’s what you want first.
> Blaming it on other ppl isn’t gonna help either. You should’ve said it by admitting your mistake, asking for a way to rectify it.
> (sigh) I bet you have no friends
I have to admit he was right. Clearly, I am not worthy of using Ubuntu!
I searched for an alternative. The second Google result for 'add user ubuntu' is: https://help.ubuntu.com/community/AddUsersHowto
There are problems:
* The article has a pants title - no novice in their right mind is going to click the link over the first result with the better title
* It begins with a tag saying it's out of date
* It's marked as immutable so those who care can't make it better.
What can we do to make these terrible articles go away?
So, yeah, you
a) didn’t read the manuals supplied with your distribution
b) instead went directly to a random third party site via Google
c) blindly trusted said site without fully reading it
d) messed up while doing so.
You should have gotten a computer-literate friend to help you, just as you don't fix your car by looking for a how-to online and dismissing the manual you can usually find in the glove compartment.
The thing is, of course I know I screwed up. I don't need you or anyone else to tell me that. I already know it.
But that has nothing to do with my complaint: the top match for a Google search for "ubuntu add user" is a site that gives out poor quality information, doesn't improve that information as a result of user feedback, and has a community that is positively hostile to feedback. Yes, as you point out, there were also helpful comments in that thread. There were also several other people who had the same problem I did, and the site never did anything to improve its content.
Isn't it obvious that this was a lousy tutorial? Doesn't the author of a site like this have any responsibility to put out useful information? Does the fact that I screwed up make my feedback any less valuable? If you're writing tutorials, that's exactly the kind of feedback you ought to be looking for, so you can improve your tutorials.
Contrast it with Arch Linux's page on user management:
> To add a new user, use the useradd command:
> # useradd -m -g [initial_group] -G [additional_groups] -s [login_shell] [username]
Here they do use the same useradd command that I was complaining about (probably because Arch doesn't include adduser by default?), but they give a complete example with all the options required, followed by an explanation of those options.
Now that's how you do it.
But we're all newbies at something, some of the time. In my case it was basic Linux user setup.
Maybe you even screwed something up once, as a result of somebody giving you bad information that you didn't thoroughly check out. (Not saying you did, just possible.) If that ever happened, which would be a better response from the source of that information: chewing you out thoroughly, or correcting the bad information?
Except... we have douchebags in the community who talk shit to newbies and then cry, "Why does everyone use Windows? How come Linux isn't more widely used?"
Just check your post history and read the number of times you wrote "RTFM." That's why.
Not being careful of that second one gets us things like the GNOME 3/Unity/Windows 8 design-by-committee that seems to be loathed so much by power users.
Is it so wrong to expect some basic minimum level of competence in a system before someone uses it? And then suggest that someone take the most basic of steps to acquire that competence before asking questions? (By R'ing TFM?)
Systems should as a rule be somewhat intuitive (IMNSHO), but you can only take that so far before you start hamstringing yourself. Devs would never get anything done if they spent all their time training people how to use systems.
Remember that everyone starts out incompetent. They ask dumb questions, and in some cases entitled newbies expect the experts to bend over backwards for them. But one thing that's nice is that a lot of questions have been answered before by patient experts.
Say there's a guy on the Ubuntu forums who asks the following question:
"Hey, how do I change my desktop resolution?"
An asshole answer is "Fucking Google it, idiot. It's not hard."
A decent answer is, "Here's a link to the answer. In the future, I suggest Googling your issue. You don't have to wait for us to answer it! Often, questions have been asked and answered before. We're here if you get stuck, though."
This has a few benefits. The first (obvious) benefit is that the newbie doesn't immediately think "Wow. What a dick, I was just asking a simple question."
The second benefit is that future newbies who Google the same question get a link to the answer.
Of course, there are lost causes (Where's the Start button? This isn't like Windows at all. I don't like it) and asshole newbies who simply refuse to try to learn anything by themselves, (Ok, you answered my question on desktop resolution. How do I change my desktop background?) but I think these people are the exception rather than the norm. And they can be dealt with in a way that is constructive rather than crass and uncouth.
Fortunately, you can easily killfile people in mailing lists, and doing so quickly (i.e. after one wrong post) helped me greatly.
No. What I am saying is that a) using a device is different from administrating it and b) in order to administrate a device you have to be an expert. You don’t expect to buy a car and never ever go to a workshop with it, because, you know, you shouldn’t have to be an expert to service your car. Similarly, expecting to be able to service a PC without being an expert is just as futile.
> Have you noticed that the most successful distributions of Linux are the ones that are most friendly to beginners?
What metric/definition of successful are you using? 
> (third paragraph)
Exactly. Your point being?
Hint: Number of users is not a measure of quality, and hence not necessarily a measure of success.
I am what you would consider an "expert" (or at least, I can definitely competently administer a linux computer) but I don't use linux distributions for day-to-day computing. The reason is that linux computers, even ones running "popular distros" such as Ubuntu, are considerably more prone to interoperability foibles and installation/updating problems.
The reason popularity matters, even though I have the technical skill to administer any OS competently, is that popularity is the single best incentive to encourage third parties to iron out the kinks on a particular platform. The reason AMD drivers work better on Windows than they do on Linux is not because Windows is "better" by any of the metrics of quality you espouse, it's because Windows is more popular. The reason Steam for Linux is only supported on Ubuntu is not because Ubuntu is "better" it is because it's the most popular distro. The reason to use popular platforms is because all of the little things that require a half-hour of expert time to work around would have already been solved by the vendor who recognized that such issues are worth testing for and fixing for a platform with 100 million users, but not for one with 20,000. That's distasteful to some because it doesn't seem meritocratic and it's out of the control of the creators of the OS, but it's a fact of business, and it does affect the quality of an OS in tangible ways.
As for the "expert" thing, the costs really aren't that different for various kinds of users. The reason I care about this stuff is because the opportunity cost of my time spent solving menial issues that would have been solved by the vendor if I were on a more popular platform is too high. The reason my mother cares about this stuff is because the transaction costs of finding someone who will spend the time to figure out what's going on and fix her computer is too high.
I prefer a working platform that forces me to read the manual beforehand over a platform where I don’t have to read the manual but random third parties iron out kinks.
Popularity is a measure of quality, but only of quality with respect to the use cases of the people among whom the product is popular. My computer usage is very different from my mum’s, and likely from 95% of the population; hence, the popularity of a product among 95% of the population is rather irrelevant to me.
If you’re a single person with a tight budget looking for a car, you don’t look at car popularity among large families owning oil fields, but among people similar (in the relevant ways) to yourself.
Hence, popularity among a large user base is an indicator that the OS is optimised for use by a large part of the population. If you find that this is indicative of the quality you’re looking for, fine.
Windows is tons better than OS X then?
I don't mean that it's okay to be completely illiterate in how a computer works, especially if you're using Linux. To take your car example, I don't think that a driver should know how to rebuild his engine. I do expect the driver to know how to change his oil and brake pads and know about scheduled maintenance.
The same is true for computers. A Linux user should know basic bash commands, how to install software, how to research problems, and so on. But he shouldn't be expected to know how to troubleshoot driver problems or know the ins and outs of xorg.conf. That's what Google and the aforementioned benevolent experts writing guides are for.
Yes, they all should just "work out of the box", but in an increasingly literal sense we trust a lot of machines with our lives. We ought to understand all of these machines better before we trust them with our lives or vote on the leaders who will regulate their use.
Of course, sometimes those abstractions are leaky, and we have to change a tire or apply an update. But suggesting that typical users should read up on SSL (and therefore asymmetric cryptography) before they use online banking is as ridiculous as saying that every new driver should study the detailed workings of the internal combustion engine. Someone needs to know how it works, but my mother really doesn't.
The certificate errors are kind of a design failure. Users are so used to clicking through messages they don't really understand, and this is just one more. Even the most alarming certificate errors are usually innocuous, in my experience - someone forgets to renew a certificate, or fails to get a new one for a domain change. That doesn't mean it's OK to ignore them, but we really need systems that don't cry wolf unless there's a really good chance that there's a wolf.
How do you propose to algorithmically determine whether a certificate that doesn't match its domain is innocuous? This is a bit like suggesting having a "check engine" light that only comes on when there's an immediate risk of an engine fire.
Secondly, the system should make it harder for the administrator to make mistakes. For instance, the web server could refuse to serve https on a domain that it didn't have a certificate for. Or when a certificate nears its expiry date, the administrator should be getting plenty of reminders about it.
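As a sketch of the reminder idea (the path, threshold, and mail setup here are all assumptions, not a recipe):

    # Warn if a certificate expires within the next 30 days.
    # openssl's -checkend exits non-zero when the cert expires within N seconds.
    CERT=/etc/ssl/certs/example.pem   # hypothetical path
    if ! openssl x509 -in "$CERT" -noout -checkend $((30*24*3600)); then
        echo "$CERT expires within 30 days" \
          | mail -s "certificate expiry warning" admin@example.com
    fi

Drop something like that in a daily cron job and the "forgot to renew" class of error becomes a lot rarer.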
Thinking bigger, what if we tied it closer to the sensitive UI? What if only pages loaded securely could show a password field? What if rather than typing credit card numbers into boxes on a webpage, we were used to using a special browser interface that would only light up on secure pages?
"Multi-level insecure operating systems may have special levels for
attack programs; the evil bit MUST be set by default on packets
emanating from programs running at such levels. However, the system
MAY provide an API to allow it to be cleared for non-malicious
activity by users who normally engage in attack behavior."
This should say that the system MUST provide such an API. Otherwise you can end up in a situation where the receiver may have to allow evil packets to pass, because they are sent by a user who has no option of setting the evil bit to 0 for non-malicious activity. Kind of defeats the point.
"Fragments that by themselves are dangerous MUST have the evil bit
set. If a packet with the evil bit set is fragmented by an
intermediate router and the fragments themselves are not dangerous,
the evil bit MUST be cleared in the fragments, and MUST be turned
back on in the reassembled packet."
This places an undue burden on intermediate routers, as they now have to parse the packets and determine if a fragment is evil.
"Intermediate systems are sometimes used to launder attack
connections. Packets to such systems that are intended to be relayed
to a target SHOULD have the evil bit set."
These packets are not 'evil' in any technical sense. They are only a request for another computer to send evil packets.
"In networks protected by firewalls, it is axiomatic that all
attackers are on the outside of the firewall. Therefore, hosts
inside the firewall MUST NOT set the evil bit on any packets."
This is just stupid. If I am attempting to send malicious packets from within the firewall, I am already required to set the evil bit. Now I am also required not to. If I am sending a non-malicious packet, I am already required to set the evil bit to 0.
Also, if I am on a computer behind a firewall and attempt to send malicious packets to a computer outside my firewall, I am still forbidden by this from setting the evil bit. This will force attackers to either break the spec or give up their own firewalls.
"Because NAT [RFC3022] boxes modify packets, they SHOULD set the evil
bit on such packets. "Transparent" http and email proxies SHOULD set
the evil bit on their reply packets to the innocent client host."
I must be misunderstanding this one, because none of those uses seems evil.
"Devices such as firewalls MUST drop all inbound packets that have the
evil bit set. Packets with the evil bit off MUST NOT be dropped.
Dropped packets SHOULD be noted in the appropriate MIB variable."
Dropping packets is a technical inevitability, and a crucial part of TCP's congestion-control strategy.
Just because the evil bit sounds like a good idea does not mean we should go ahead and implement it without carefully looking at the technical implications of the spec.
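For what it's worth, the firewall requirement quoted above would boil down to something like this (purely hypothetical, since no real stack sets the reserved bit, and the u32 offsets are my own reading of the IPv4 header layout):

    # Drop inbound IPv4 packets with the reserved ("evil") bit set.
    # Bytes 4-7 of the IP header hold the ID and flags/fragment-offset fields;
    # the reserved bit is the top bit of byte 6, i.e. 0x8000 within that word.
    iptables -A INPUT -m u32 --u32 "4&0x8000=0x8000" -j DROP

Note that this only covers the drop half; the "MUST NOT be dropped" half is the part that clashes with congestion control, as pointed out above.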
I agree that learning about SSL is probably overkill; but we do need more computer literacy among the general population. Think of it like this: finance is something that is hard, yet people learn about the basics so that they can keep their money safe. Basic computer literacy should have a similar status as a skill that is required to, say, keep your information safe.
edit: another, perhaps more humorous example. How many times have you heard about people getting their Facebook accounts "hacked" and getting "viruses" on Facebook? I've been asked by multiple people to fix their computers, when the real problem was they had no idea what they were doing on the Internet.
Use the product you feel is best suited to your task. If that is Windows, use Windows.
Edit: That said, you will likely want someone knowledgeable around when setting up Windows or Mac OS X, too, so I don’t really get your point.
I didn't pretend that people never needed help with Windows. I said I hadn't had any real problems with it since 98 and asked what he thought the average user needed a sysadmin for in Windows.
I can see how you'd come to the conclusion you seem to have, but I really don't appreciate being made out to be a liar. If you act on a similar assumption in a similar manner in the future then I'm going to ignore anything else you have to say.
What does seem relevant to me is that there's a dramatic difference between an operating system that even programmers get tired of trying to make do what they want and jump ship from, and one that you can reasonably expect not to run into significant problems with for years. If Windows were as messed up as Linux, I'd expect to run into problems at around the same rate.
How long does it take you to run into trouble with Linux? I tried it just now, downloaded Mint and threw it in the old DVD drive - 15 minutes. Most of which were spent trying to work out how to change the primary monitor so I could get the taskbar onto the one I wanted. When I did manage to do so the graphics corrupted and the system froze irrecoverably.
As for your ubuntu issues - you booted off a liveCD. Good for you. Now, how many liveCDs have you seen of Windows? How widespread are they?
Anyway, the same problems apply in setting up the system regardless of Windows or *nix - you need a power user to get you through the humps. Good luck installing XP as a normal user when you run into the ever-present problem of no appropriate network drivers. Basically you're comparing apples and oranges.
There's a limit of naivety beyond which everyone will need help, yes. I simply believe that you can get away with being more naive with Windows most of the time. I don't have to fix my mother's Windows problems, or my sister's - from time to time I have to fix my father's, but even then not just for setting the thing up.
> As for your ubuntu issues - you booted off a liveCD. Good for you. Now, how many liveCDs have you seen of Windows? How widespread are they?
I think I've seen two or three versions over the years, mostly as repair-tool environments. Why?
> Anyway, the same problems apply in setting up the system regardless of Windows or *nix - you need a power user to get you through the humps. Good luck installing XP as a normal user when you run into the ever-present problem of no appropriate network drivers. Basically you're comparing apples and oranges.
Windows XP network drivers - that's three versions out of date. And really, where are your discs? If you got it ready-made they should have sent you the drivers along with their bloatware, and if you didn't then they should have sent you the mobo drivers with the mobo.
And also to be fair, XP is in a valid sense only one version out of date. Vista was a fiasco, and that's not just a meme -- Vista was released way ahead of schedule and still had a ton of known unresolved bugs. It was by far MS's most buggy release, and the whole thing was a disaster. Compared to the relative stability of XP, Vista was a step backwards. And while Windows 7 can rightly be called a newer version of Windows XP (and is better in nearly every way), it would be a stretch to say that Windows 8 is a newer version of Windows 7. Windows 8 is a fork into a different paradigm, and is so different it's not even comparable to Windows 7 along a number of dimensions.
All that said, XP is still pretty damn out of date, even with the latest service pack installed, and if Windows 7 is an option, it's usually a better one. Of course, XP uses less memory, so it may be suitable for particularly old computers. And for people that don't like to pirate software, owning a copy of XP is a great reason to stick with XP (if it works well enough, why fork out the money for Windows 7?).
My personal experience (as someone who has been dual-booting Windows and Ubuntu for nearly a decade and uses both operating systems regularly) is that both have a lot of annoying problems, but Windows has always been way more of a pain in the ass, and I've run into more problems using it. Especially when it comes to driver issues, Windows has problems (it took hours to get a standard WiFi card to work, and once my video driver got corrupted so badly the system wouldn't even let me try to fix it, and restore failed -- I had to reinstall the OS to fix it).
P.S.: Red Hat is selling server versions and support for millions of dollars. Ubuntu still is not profitable despite all they have done for the community. That seems unfair to me.
Chromium works fine.
Ubuntu, if you're reading this: please consider testing your releases more thoroughly.
It's crazy how many OSX features come from the Unix world, but in a delayed, better-looking fashion.
The last missing major piece for OSX was good multiple-workspace management, which came in the form of Mission Control and the touchpad gestures that make it so efficient. I don't know if it's any good with a mouse though.
Efficient multiple workspaces is a critical productivity feature for me. It's like having multiple monitors without having to carry a desk and a bunch of monitors with me all the time.
The previous OSX implementation, called 'Spaces', took too many actions to switch workspaces unless you had your hands on the keyboard and had set up shortcuts. This made OSX unusable to me.
Unfortunately, Ubuntu also regressed by getting rid of the single-click workspace switcher (which was working beautifully since like 1995) and doing a multi-workspace UI that resembles Lion's 'spaces'. However, I was able to remedy this by setting my middle mouse button to bring up the workspace switcher which is really fast.
The other Ubuntu regression that annoys me to no end (and again is the same in OSX): clicking a dock icon while an instance of the app is open in another workspace automatically jumps you to that workspace instead of opening a new instance in the current one.
This assumes I separate applications on a per-workspace basis, which is stupid. I already have a dock to get access to different applications. Workspaces are used to separate projects or workflows.
Each workspace needs its own browser window that holds the tabs related to the project being worked on and also its own terminals and text editors. I don't want to be yanked out of a context because I tried opening an editor and there wasn't one already in the workspace.
Hopefully this is eventually fixed in both OSX and Ubuntu.
If you middle click the icon it will open a new one instead of taking you to an existing one.
I just tried it and it does open a new instance _and_ go to the switcher which is not great.
Also most of the time if there is already an app window in the workspace, I would want it to be made visible not a new instance to be opened.
Similarly if you use a keyboard shortcut to open your icons (like Win+1 to open the first icon) holding shift while pressing the shortcut will open a new window (Shift+Win+1 will open a new window for the first icon).
(There are several valid reasons for switching from one linux distro to another, it just makes me sad to see so many "I don't like Unity so I'm moving to Gentoo" posts -_-)
Unity has a huge flaw, though: to find a program you have to know its name. That sounds silly to non-nixers, but for instance some are known as one thing but actually are named something quite different (e.g., 'Document Viewer' = 'Evince'). In cases where unity doesn't recognize both names this can be a problem.
I used to learn the command-line invocation by looking at the shortcut in the menu. No more in unity (at least I haven't learned how).
It's also extremely satisfying in a mischievous sort of way whenever I walk into some public place and pull out my T410 with that setup. I can't count the number of stares I get. One guy even asked me why I used a laptop from the 1990s.
Installing Cinnamon on Ubuntu was broken, and many things didn't integrate like they did with the default install, which is why I just switched to Mint in the first place rather than screwing around with UI settings.
I got fed up with the DEs and switched to #! after about 2 months.
I still think it's Openbox, Wmaker, or die.
In short it's more of a pig for a user to configure their desktop style than it was in Windows 98.
So yes you could swap your theme, but I have yet to find many professional themes, and even the better ones like Bluebird, have issues.
This is one thing that really infuriates me about new Android devices: their lack of syncing support with Linux boxes.
Does anyone know if this support is something that can be backported to 12.04?
Is browsing/copying laggy or slow at all, like on Windows 7?
The Nexus 4 cannot unmount its internal storage, so something like MTP is required.
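One stopgap until proper support lands is a FUSE-based MTP mount; a rough sketch, assuming the mtpfs package is in your release's repositories (the mount point is arbitrary):

    # Mount an MTP device such as the Nexus 4 over USB.
    sudo apt-get install mtpfs
    mkdir -p ~/nexus
    mtpfs ~/nexus

    # ...copy files back and forth, then unmount:
    fusermount -u ~/nexus

It works, but FUSE MTP mounts are generally reported to be slower and flakier than plain mass storage.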
Currently I am thinking of switching to LMDE (Linux Mint Debian Edition) because I don't really like the way Ubuntu is going with Unity, Mir, and upstart - doing too much stuff on its own.
Anyone with experience from LMDE here? Is it stable? Pros? Cons?
Xubuntu is still led by Ubuntu though, and it currently uses upstart. I'm not sure if Xubuntu will use Mir in the future.
The top feature on this page is a deal-breaker for me: http://xfce.org/about/tour
A Linux distro with no local man pages? Seriously? I sometimes use my laptop in places that don't have wifi. Not to mention the fact that the online manpages may become out of sync with upstream, or that the online manpage may describe a different version than what you have installed locally, e.g. if you're using an LTS version that's years old...
Since then I've moved on to CrunchBang, which I absolutely love. For me, it hits the sweet spot of having everything I need without all the extra pizzazz (cruft?) of something like Ubuntu or Mint.
There is an ongoing issue with Mint Update, but I'm aware a patch has been submitted. In the meantime I just run sudo apt-get update && sudo apt-get dist-upgrade -y && exit every day. (Kernel updates get held back if apt-get upgrade is run instead of apt-get dist-upgrade, because upgrade never installs new packages and each kernel update arrives as a new versioned package.)
Mint is awesome and one of the better distros out there. Linux Mint 15 looks like it's going to be a decent improvement too.
I started using #! some time ago. I agree with the lightweight aspect of it very much. One session of openSUSE on KDE reminds me of how lightweight CrunchBang is :-D
For people already on non-Ubuntu Debian systems, I would recommend taking it for a spin.
My feeling is that Fedora has the better engineering, but Ubuntu has better polish. Korora puts some icing on top: good font rendering, Adobe Reader, Skype, nvidia drivers.
Fedora stays closer to upstream than Ubuntu. They just package Gnome 3 for example.
If Unity had a functional Alt+Tab interface (swapping between open windows instead of open applications) I'd have no further complaints at all. I've grown used to it. Can't wait to see how 13.04 improves things.
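There is a workaround of sorts, assuming you're willing to poke at Compiz plugins (compizconfig-settings-manager is the package name as I remember it):

    # Install the Compiz settings manager, then in the GUI disable the
    # Unity plugin's Alt+Tab binding and enable the "Static Application
    # Switcher" plugin, which cycles individual windows instead of apps.
    sudo apt-get install compizconfig-settings-manager
    ccsm

It's fiddly, and misconfiguring Compiz under Unity is a classic way to hose your session, so tread carefully.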
Honestly I try to use E17 whenever possible (there is a bug with VMware Player that makes it impossible to use them together for any length of time; virt-manager and kvm-spice are only so helpful in comparison, so for me it's mostly E17 plus some time in another system with Windows loaded in a VM)
In E17 the Alt+Tab behavior works as described (within the current desktop only, to keep large numbers of windows manageable... this may be an option, if you want Alt+Tab to switch between all open windows, there are a lot of options)
I know it's fashionable to use the desktop environment that comes with your distribution/pick a distro that has the DE you want, but I like to push E17 at every opportunity, it can usually be installed in any distro without too much trouble, and especially now that you can't claim it will "never be released" there should be stable packages in your modern repos :)
I don't care too much about the next Ubuntu release personally, I am waiting for eLive Gem ~3.0 (whatever version is next) on Debian Wheezy which should be out soon!
Then again, YMMV and everybody is probably better off making their own decision here :)
And the best thing was it didn't depend on Ubuntu. I said this because I liked even Mint Desktop better than Ubuntu which it was based upon.
Since then I've switched to some other OS.
13.04 works absolutely wonderfully, on my 2012 MacBook Air. Everything works perfectly for me on that machine.
On that screen size, OSX will appear like the top image on a non retina display, and like the middle image on a retina display. Ubuntu will display like the top image on a non retina display, but will appear like the bottom image on the retina display.
As if that wasn't enough, the 13.3" version is 2560x1600. Which is completely insane.
Eventually, I gave Kubuntu 13.04 beta a try and, surprise, it works perfectly. No Unity nonsense, X works great.
It's a bit strange switching to KDE now. I've actually used KDE 1.0 back in the day, and Gnome 1.0. I've been a Gnome user most of this time. But now I'm starting to find even plain Gnome quite frustrating.
I guess I'm just not yet used to "proper Linux" being something you can download for embedded-type architectures and devices without heading to XDA first.
Still, definitely good to see Ubuntu expanding support to the sort of devices which are actually new and interesting.
But the stock Intel driver works well enough; I just don't want to figure out what sort of CLI cargo-cult fixing it will take to get it going.
I'm not blaming bumblebee specifically here, it's just a pervasive side effect of Linux and the modular approach. I can really appreciate the integration that goes in to a Mac.
pkolaczk does bring up a good issue about the DisplayPort not working properly though. I don't use an external display on my laptop and have never played with that.
On another note, I remember installing 12.10 on my desktop right around the time it came out. My desktop has a Radeon HD3870 video card which was completely incompatible with Unity. I don't remember the issue exactly but my choice was either to use the open source drivers or have the desktop environment fail to show at all with ATI's drivers. I just did a search and I guess the fix is to downgrade X-Server and install a legacy driver. It's a shame something as central to the user experience as GUI performance still doesn't work out of the box or worse yet, critically breaks on an upgrade.
Always use that instead of the official desktop CDs, to avoid all the extra programs that I never need.
I have been using 64-bit whenever possible. I wonder if any weird behavior can be attributed to it?
Edit: found the source. It was actually 25%, a year ago. And the other reason was that the error message when trying 64-bit on incapable hardware was very cryptic. https://lists.ubuntu.com/archives/ubuntu-devel/2012-April/03...
esennesh@lap282:~$ sudo do-release-upgrade
Checking for a new Ubuntu release
No new release found
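Two things worth checking when that happens right after release day (the file path is the stock update-manager location; treat the rest as a sketch):

    # See which releases the upgrader is allowed to offer:
    # Prompt=normal offers every new release, Prompt=lts only the next LTS.
    cat /etc/update-manager/release-upgrades

    # Force the upgrade by treating 13.04 as the development release:
    sudo do-release-upgrade -d

The metadata that do-release-upgrade reads sometimes lags the ISOs by a few hours, so simply waiting can also work.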
If they announce an update, it should be available for us before we start doing actual work.
Canonical is apparently going more in the direction of pre-installed crapware. There was an Amazon icon on my Launcher, which I removed. People have also been warning that the Ubuntu Dash sends your searches to Amazon, so I went to System Settings -> Privacy and turned that off.
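If you'd rather remove it than just toggle the privacy switch, the Amazon results come from their own lens package (name as I recall it, so double-check before removing):

    # Remove the Dash's shopping/Amazon suggestions entirely.
    sudo apt-get remove unity-lens-shopping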
Haven't noticed that much. It may be a little faster, although my Ubuntu desktop usually runs faster when I reboot it, so I can't tell if it will stay this way yet. Doing horizontal resizing of windows seems a little more difficult, some setting must have changed. Firefox's plugin service seems hosed right now, maybe due to all the new Ubuntu installs. The Firefox edit window is acting a little odd as well. It has crapped out a few times in terms of display.
There were no workspaces until I turned them on in settings, they seem down on workspaces - something I had without a problem since fvwm back in the mid-1990s. Hopefully there will still be workspaces in the next Ubuntu version.
It seems like they've added a little more customization ability for the launcher, sizing the launcher, hiding the launcher etc. Or at least made it more obvious. It doesn't work that well but at least it's there.
I noticed they changed other things...grub now just says Ubuntu on top, and has simplified options under that. I guess I'll bump into more things as I go on.
Thanks for the tips, when we move to 14.04 LTS we will need to know them.
> There were no workspaces until I turned them on in settings, they seem down on workspaces - something I had without a problem since fvwm back in the mid-1990s. Hopefully there will still be workspaces in the next Ubuntu version.
Do you mean the cool workspace switcher on the Unity Launcher panel is going to be gone? That's odd.
Although Linux 3.8.8 might well make me upgrade.
No change/upgrade will ever fit every user. I use Python and LibreOffice (not that these really depend on the new release) and I like Unity; can't wait to try it out.
* I have actually run into one problem (with Java) arising from the fact that Awesome is a non-reparenting WM. This rarely came up, and was solved by claiming to be running 'LG3D' as my window manager, as that is on Java's hard-coded list of non-reparenting WMs (the usual workaround is sketched below).
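For reference, the usual way to do that is the wmname utility from suckless (assuming it's packaged for your distro; the OpenJDK environment variable is the other common fix):

    # Report the window manager's name as LG3D so Java's AWT treats the
    # session as a known non-reparenting WM and stops drawing blank windows.
    wmname LG3D

    # Alternative (OpenJDK): tell AWT directly that the WM is non-reparenting.
    export _JAVA_AWT_WM_NONREPARENTING=1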
Note to self: continue to expect the worst in people.
(Edit: no malice understood, just brooding...)
The list of applications compiled with PIE is not currently up to date, but is being worked on (part of the problem is a lot of developers started compiling their packages with PIE and we now have to find them. A nice problem to have :). Also note that incremental security features are not usually mentioned. For example, seccomp2 is used in a number of places now and AppArmor improvements such as mediation of mount or the upcoming DBus work are/will not be specifically mentioned.
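If you want to check a specific binary yourself, one rough way (readelf ships with binutils; the path here is hypothetical):

    # A PIE executable is linked as a position-independent shared object,
    # so its ELF type shows up as "DYN"; a traditional executable shows "EXEC".
    readelf -h /usr/bin/some-binary | grep 'Type:'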
I know a lot of work goes into these releases but when basic desktop functionality has been missing/broken since the 11.x series and fails to be addressed with each new release it starts to become depressing.
I could tweak a Linux box all day long. On a Mac I just turn it on and go to work. Now the hard part is deciding between GNOME3, Unity, LXDE...
While I can deal with installing additional packages in order to gain control over such settings it just strikes me as a little ironic that such things are not easily configurable in one of the major desktop focused Linux distributions.
The Unity interface still sucks but they're making some subtle changes that make it better. I still don't like it.
What worries me most about Ubuntu's rising popularity is the fact that they're trying so hard to nail the tablet market. I would like to see a good Linux distro reach the popularity Ubuntu has, but offer more flexibility and focus on the desktop. Linux Mint is making strides in this direction, and looks to be a very nice candidate for it.
My prognosis is much the same as it has been for a while now:
For goofing off / watching movies/ checking email - Ubuntu 13.04 will be great.
For productivity I'm still using Mint, Gentoo or Ubuntu 10.04
For a server it's Gentoo all the way.
Looks like it never will, unless there's a Raspberry Pi with an ARMv7 (or up) arch.
Is there any particular reason you want it though? I use Raspbian on mine, and it works just fine. I couldn't be arsed with a GUI though, since the RasPi is just too underpowered to run a GUI alongside all the other crap I have running on it.
A friend received his Pi yesterday but hasn't set it up yet. I can ask him to try it :-)
EDIT: "sudo do-release-upgrade -d" did the trick. I will keep updated on progress.
This is the closest to any release notes: https://wiki.ubuntu.com/RaringRingtail/ReleaseNotes#Ubuntu_G...
My only problems were in installing it on a new laptop. I wanted to dual-boot Windows 8 and Ubuntu. No dice. Turn off signed booting, wipe the weird set of partitions Sony uses for Win 8 and then things went smoothly. I'm keeping the old machine for Windows 7.
Ultimately, blowing away the compiz configs (rm .compiz*) and running dpkg --configure -a fixed the AMD driver, and boot-repair from an Ubuntu LiveCD was a wonderfully simple way to redo grub.
Not painless but not particularly difficult to fix.
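Roughly, the recovery amounted to this (boot-repair runs from the LiveCD session; the compiz glob is whatever dotfiles your version leaves behind):

    # Reset the per-user Compiz/Unity configuration.
    rm -rf ~/.compiz*

    # Finish configuring any half-installed packages (the AMD driver in this case).
    sudo dpkg --configure -a

    # Then, from an Ubuntu LiveCD session, run boot-repair to redo grub.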
Migrating was painless, except for finding enough disk space on an external drive to move all my somewhat disorganized and sprawling file hierarchy.
However, anyone on 12.10 should be able to "do-release-upgrade" and upgrade themselves to 13.04.
The downside of that is that 12.10 and non-LTS versions in general only get bug fixes for critical bugs, otherwise the bug fix is the upgrade to the next version, in this case 13.04. So you end up on the upgrade treadmill, whether you want it or not.
I'm waiting for a month at least before installing any new Ubuntu version though. At release they tend to be a little unstable.
Regarding bug fixes and the need to stay on the upgrade treadmill, it's much easier nowadays to stick with LTS versions since they implemented the LTS enablement stack (https://wiki.ubuntu.com/Kernel/LTSEnablementStack). Essentially it's the kernel and X.org of the current release backported to the LTS release, so you don't miss out on new hardware support while getting all the bug fixes and stability that come from the LTS.
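On 12.04, for instance, pulling in the 12.10-era stack is a couple of meta-packages (names from memory, so double-check against the wiki page above):

    # Install the backported quantal (12.10) kernel and X stack on 12.04 LTS.
    sudo apt-get install linux-generic-lts-quantal xserver-xorg-lts-quantal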
That being said, however, I'm finding 12.04 to be extremely buggy when you're using Unity+Compiz.
I really cannot fathom what is going on at Canonical. I get that they want a more friendly UI, because that will create a better user experience for a broader set of users. I even use Unity, and like where they're going with it. But can't they see that creating a flaky, unreliable experience is not actually helping? And that they're undermining Linux's major strength, a reputation for reliability?
By all means, Canonical, keep innovating. But stop breaking shit along the way. If the goal is a better user experience, make sure you're actually delivering a better user experience before releasing.
Number theory not required.
Looks like it was fixed at the last second.
sudo apt-get autoremove unity && sudo apt-get install [xubuntu-desktop|kubuntu-desktop|lubuntu-desktop|etc.]
Or just install the one you want in the first place: http://cdimage.ubuntu.com/ has images for all eight official Ubuntu flavours.
Seriously, it can't hurt you just by being in the repositories if you don't install it...