Thank you, Ubuntu (psung.blogspot.com)
164 points by rglullis on Oct 2, 2010 | 127 comments



I keep trying to evangelize the fact that things like apt are one of the biggest wins of the various UNIX-like OSes. In the Windows world, installing software means both finding it and configuring the installer. On a Mac you at least have to find it. On Ubuntu/Debian/etc. you just have to know the name (or you can search a central database). Also, no more Java/Flash Player/Acrobat Reader "needs to be updated" messages. You update everything at once. And it can all be scripted. And you can update your entire OS to the next major release in three commands. I can go on and on.
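A rough sketch of what that looks like in practice (do-release-upgrade comes from the update-manager-core package, so I'm assuming that's installed):

  # update everything at once
  sudo apt-get update && sudo apt-get upgrade

  # jump to the next major release
  sudo do-release-upgrade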


It's so true - when I configure a Windows machine I can't believe what a pain it is to go find and install all the needed software (go to web page... click... find download area... download... double click... click 800 more times...). And then, uninstallation is a mess. That's one of the things I find so silly when people say Linux isn't user friendly: the package management really is amazing, and people who have never used Linux just have no idea.


Check out http://ninite.com/. The name is extremely forgettable (I have it bookmarked under mass installation), but it lets you install dozens of Windows apps with one click!


The package manager concept is actually what I hate most about Linux systems, and Ubuntu's strategy to not let you upgrade some packages without upgrading the OS itself is insane.

Packages tend to thrust some completely idiosyncratic configuration scheme down my throat that is documented nowhere and that creates conflicts of its own. They don't just solve dependencies; they encourage dependencies, because they are supposed to be resolved automatically anyway. But they're not. Things break all the time. The Linux dependency hell has far surpassed the infamous Windows DLL hell of the 90s.

I was called to help people fix their Ubuntu installation after some package upgrade had uninstalled half the system. I had an upgrade replace a working but proprietary wifi driver with a non-functional "free" one, etc., etc.

Most of the problems I ever had with Ubuntu were caused by that completely broken package management idea. If I want Python 2 and 3 on the same system I can't have a package manager tell me that's not supposed to be when I know full well that it's no problem.


They don't just solve dependencies they encourage dependencies

This is a good thing. Developers' attitude toward dependencies gets very lax when it's too much work to declare them or too much work for the user to satisfy them. Then you get into situations where you install a piece of software and 90% of the functionality is broken, because the developer was "considerate" enough not to "force" a bunch of dependencies on you, so you resort to Google to find the real dependencies, which of course aren't available for your distro. Apt gives developers a flexible way to declare different kinds of dependencies, and the Debian and Ubuntu distro repositories give them confidence that the dependencies will install smoothly for users.

I had much more difficulty with undeclared dependencies back in the day than I have with excessive dependencies now. It's just wrong for software to easily install in a mostly broken state. If you want broken, partly-working software, you should have to do extra work to achieve that, not the other way around.
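As a concrete illustration, apt makes declared dependencies inspectable; a quick sketch (the package and library names are just examples):

  # relationships a package declares: Depends, Recommends, Suggests...
  apt-cache depends vlc

  # and the reverse: everything that declares a dependency on a library
  apt-cache rdepends libavcodec52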


How about statically-linked builds? Hard drives and memory are cheap compared to the annoyances of having to manage dependencies on the user's system.

Package management is a solution to a problem that doesn't need to exist in the first place.


What do you propose instead? So far package management solves these problems for me:

  1. Upgrading the entire OS at once.
  2. Easy updates.
  3. Almost never having any issues with incompatible software. When issues arise I can easily roll things back or wait until they are fixed.
  4. Never having to hunt for software on the web.
  5. Ability to script server/workstation deploys.
  6. Never having to spend more than 5 minutes on maintaining my workstation.
That last point is huge. How long would it take you to update Java, Flash Player, Adobe Reader, IE and install SP# on Windows?
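And point 5 is what makes point 6 possible: a workstation deploy boils down to a short script like this (package names picked purely as examples):

  sudo apt-get update
  sudo apt-get install -y vim git-core openjdk-6-jdk flashplugin-installer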

If there is an alternative way of approaching it I'd be very happy, since it can only be an improvement upon already stellar apt/yum/BSD ports, etc.


I'm only speaking to the dependency-resolution function that makes package management necessary on certain OSes. Public application repositories sharing a standard interface still have some nice features.

But if the applications are all statically-linked, I don't have to worry about dependency conflicts, and some things are a lot simpler, e.g. copying the binaries off to a flash drive to take them to another system, or running two different versions of the same software simultaneously.
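The gcc version of that idea, as a minimal sketch (a toy program; real applications also need static archives of their libraries available):

  # build a fully self-contained binary
  gcc -static -o hello hello.c

  # no runtime library dependencies to chase
  ldd hello    # prints "not a dynamic executable"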

On Windows, I don't usually even run installers anymore; I extract the installers, collect all of the DLLs into the application directory, and compress everything with UPX. Once the program is running out of a self-contained directory, I can run it off a thumbdrive, copy it to my Dropbox, etc. Plus I have the option of not updating software when I prefer not to.


I don't really see how your approach to installing Windows applications can scale. Also, pinning will allow you to hold back any packages you'd like.

The only potential problem I see is the statically-linked applications issue. However, I think that in the case where libraries are open and shared amongst many applications/other libraries, it makes no sense to use static linking. Its advantage is that you don't need to worry about dependencies, but that "is a solution to a problem that doesn't need to exist in the first place", since package management takes care of these issues.

IMHO it's better to spend 30 minutes once a year resolving an issue that arose when a Debian/Ubuntu repository got screwed up for a short period of time, than to constantly try to solve the problem of managing software installation/upgrades.


Under Windows and OSX, statically-linked or self-contained packages are indeed the only way to stay sane when deploying an application.

However, this comes at a price: You'll always have to provide a newer version of the application whenever a serious issue is fixed in one of your dependency libraries. This means extra work for you. Also, you'll have to take care of an auto-upgrade mechanism.

On systems with a package manager, you'll only have to provide updates if you fix an issue in the application itself. You also don't have to write an auto-upgrade mechanism. Just upload a new package to your server.

However, if you don't care that much about users' security, statically linking is indeed a lot less work than using a package manager.

(FWIW, I'm the maintainer of Mingw-cross-env (http://mingw-cross-env.nongnu.org/). This is a project that provides a cross-build toolchain for statically linked Windows applications. It builds a cross compiler and lots of free software libraries. We have that kind of discussion from time to time on our mailing list.)
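Roughly, a build there looks like this (a simplified sketch; the exact make targets and target triplet depend on the release you check out):

  # build the cross compiler plus whichever libraries you need
  make gcc libpng

  # then produce a statically linked Windows binary from Linux
  i686-pc-mingw32-gcc -o app.exe app.c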


Then some central library has a security hole and half your system has to get updated, instead of just one thing.

Also, you might like http://sta.li/ ;)


Generally speaking, two vulnerable apps is less than twice as bad as one vulnerable app - and if a centralized repository means that all of your apps are fixed sooner, then it's a big win.


True enough, if the security hole is in a library that half of the software on my system is using. But most applications have fairly frequent update cycles for their own codebase anyway, and it's just as likely that an update to a central dynamically-linked library might introduce a new security hole that instantly compromises half of my system at once.

Thanks for the link to sta.li, I'll check it out. What I'd really like to see is a Linux distro that managed applications the way OSX does - every program is contained in its own directory.


IIRC, they still dynamically link against system libraries. It's just that OSX isn't a moving target with respect to what is expected to be in the system in its 'base' state for a specific OS version. So you dynamically link against everything that should be in the system in the 'base' state, and all other libraries need to be self-contained in the .app directory. The problem with this on Linux is that: 1) the 'base' configuration is a moving target that varies across vendors, and 2) most of the libraries/etc. that would make up that 'base state' are things that people would want to upgrade/modify/whatever, thereby messing up the 'base state' you were supposed to be able to rely on. (You don't have this on OSX because Apple controls most of the things in the 'base state'.)
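You can see the first half of this with otool; a typical app's main binary links only against libraries and frameworks that ship with the OS (the app path is illustrative, output trimmed):

  otool -L /Applications/SomeApp.app/Contents/MacOS/SomeApp
  #   /usr/lib/libSystem.B.dylib
  #   /System/Library/Frameworks/Cocoa.framework/Versions/A/Cocoa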


"What I'd really like to see is a Linux distro that managed applications the way OSX does - every program is contained in its own directory."

That exists, it's called Gobo Linux. I've never touched it, but you can give it a try:

http://www.gobolinux.org/


Hard drive space is cheap but bandwidth less so. Having only one copy of a library makes downloading, installing, and updating your OS much easier.


Ubuntu's strategy to not let you upgrade some packages without upgrading the OS itself is insane.

You can; it's called "pinning:" https://help.ubuntu.com/community/PinningHowto
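A sketch of what a pin looks like in /etc/apt/preferences (the package name and version pattern are made-up examples; a priority above 1000 holds that version even when newer ones appear):

  Package: firefox
  Pin: version 3.6.*
  Pin-Priority: 1001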

If I want Python 2 and 3 on the same system I can't have a package manager tell me that's not supposed to be when I know full well that it's no problem.

Let me check; here on Jaunty I can separately install python2.4, python2.5, python2.6, python3.0, & python3.1.
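In shell terms (using the package names from that era's repositories):

  sudo apt-get install python2.6 python3.1
  python2.6 -V
  python3.1 -V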

I don't doubt that you've had problems with the package managers, but, um, I think you're doing it wrong.


I guess what I'm "doing wrong" is to expect sensible out of the box behaviour. I'm fully aware that I can make it do whatever I want to if I put enough work into it. That holds for almost everything in life. What's limited in my life, though, is time.

Installing Python 3 alongside the old version didn't work for a long time after Python 3 came out. If it works now, that's fine. I compiled my own Python 3 a long time before it finally appeared in my system's package manager.

[Edit] And one more thing. I'm not a sysadmin and my interest in system configuration and change management is very limited. But I've been using Linux for a very long time, so if I'm still doing it wrong then many people will be doing it wrong, and that means there's something wrong with the way it works.


On the contrary, you're acting like you desperately want to be a sysadmin by demanding to control which specific version of each package is installed.


No, I want developers to be in control of dependency management.


> The package manager concept is actually what I hate most about Linux systems

You really should give Slackware a try. None of that sophisticated package management nuisance[1] and everything just works!

[1] http://en.wikipedia.org/wiki/Slackware#Package_management


I'm not sure I would say "everything just works" and Slackware in the same sentence!


I'd say "Everything just works as expected"


> Ubuntu's strategy to not let you upgrade some packages without upgrading the OS itself is insane.

you're doing it wrong


yeah, until the repository maintainers mess up, create two packages that are in conflict yet both end up installed on your system, and now every attempt to update your software fails because you have out-of-date dependencies that can't be updated due to the conflict.

In case you hadn't noticed, I'm a bit bitter about my fedora install at the moment :)

That said, I think you overestimate the delta between OSX and the various linuxes. On the Mac, as with linux, if you want to use a package that isn't part of the core system, you should package it into your app/lib yourself. The big advantage that the linuxes have is that they have a far bigger range of packages in the repository than you have in a mac's standard install. But by the same token, you can't have as much confidence that all the libs play well together, because they haven't been tested as thoroughly as Mac OSX's set of libs...


I'm not sure I follow your suggestion of "package it into your app/lib yourself". Are you suggesting manually installing stuff that isn't available from Macports or Fink? Or something else? For the record I'm pretty green when it comes to Unix sysadmin stuff so go easy...

I'm genuinely interested - I'm a new Mac user who came from Linux and my biggest issue on the Mac so far is that package management isn't as good as apt. There was a discussion on HN and I was disappointed to find a lot of other people have had the same feeling.


You should check out Homebrew for Mac. I've found it a little bit more reliable than Macports, and it's actually reasonably easy to create your own packages. To create a new package you can just `brew create http://foo.org/crufty-package.tar.gz` and 90% of the time the generated recipe just works and the other 10% it requires you to tweak the (quite readable) ruby file.

Also, while Homebrew has an official repo, it's managed through git and hosted on Github. So for any given piece of software, chances are pretty good that someone out there has packaged it.
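The whole flow is only a few commands (reusing the hypothetical tarball from above):

  # generate a formula skeleton from a source tarball
  brew create http://foo.org/crufty-package.tar.gz

  # tweak the generated Ruby file if needed, then build and install
  brew edit crufty-package
  brew install crufty-package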


For what it's worth, I'm an ex-Mac user who went back to Linux because I missed apt. Of course, I do a lot of hobby programming and general messing around with open-source software; if your use-cases tend more towards pre-packaged software, you might get on perfectly well.


I do sort of both by running Debian inside Virtualbox on my Macbook as a dev environment. I especially like being able to install -dev packages to get header files/libraries into the right places, whereas on OSX, if you want to compile software that depends on anything that didn't come with XCode, it's a mess of manually extracting things and passing pathnames.
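For example (the library is chosen arbitrarily):

  # headers and pkg-config metadata land in the standard locations
  sudo apt-get install libxml2-dev

  # so compiling against it needs no manual path wrangling
  gcc -o parse parse.c `pkg-config --cflags --libs libxml-2.0`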

I don't go the whole way and just install Debian on the machine mostly because having OSX handle things like the touchpad driver, hibernation, wifi, etc., is less hassle, and I also like OSX as a desktop environment (e.g. being able to use multitouch gestures in the browser). But Linux in a VM is definitely a nicer experience imo than fink was, as a way of getting a parallel Unix CLI userland onto OSX.


> I don't go the whole way and just install Debian on the machine mostly because having OSX handle things like the touchpad driver, hibernation, wifi, etc., is less hassle

Have you tried this recently, or do you just say that because it's the stereotype? Because it's been over three years since I had any hardware trouble with Ubuntu. These days I have more issues on my friend's OS X Macbook (no drivers for 3G modems out of the box) than I do on Ubuntu where everything just works.


I definitely haven't written off switching back to Linux - if I ever do it will probably be because of apt (and maybe xmonad). For now, installing the vanilla versions of stuff like ruby 1.9, SBCL, mit-scheme from Macports, and pre-packaged versions of MacVim and emacs are working for me but that may not always be the case...


Homebrew is the best, but I always end up building everything I need from scratch, because I'm paranoid that some idiot package maintainer will require some tiny library that will necessitate building and installing X11; avoiding that kind of software is largely the reason I use a goddamn Macintosh in the first place.

Also, Interface Builder.


MacPorts can be hard to grok if you haven't used a BSD before (not that it's great anyway), and Fink is a bit more painful than apt et al.

If you're on OSX, the ace up its sleeve is Homebrew[1]. Definitely the easiest package management system anywhere.

[1]: http://github.com/mxcl/homebrew


I'd forgotten about Homebrew but found it again when I went hunting for the earlier discussion on HN... I'll definitely be checking it out soon, thanks.


fedora- are you talking about rpm? Yes, this is a nightmare- package conflicts are common-place, and generally a real b*tch to fix. RPM was the main reason I ditched Fedora and switched to Ubuntu. In 3 years I don't think I've ever had an install failure or conflict or any problem with apt-get. It's always worked flawlessly.


http://ninite.com is getting close to apt-get for Windows. We're working on a Mac version too.

Things are better on Linux, but not all rosy. Installing Dropbox, for example, is a surprisingly crappy experience (windows popping up, separate proprietary daemon download, etc). That, and way more user requests than you'd imagine, is why we have http://ninite.com/linux now too.


Certainly looks promising. For some reason it reminds me of Portable Apps.

As for Dropbox, yes, I recently went through installing it. Terrible experience on Ubuntu. I think the reason is mostly that it is using a proprietary daemon. A FUSE driver seems like a much better idea, and if they released it under an open license, it would be as easy as apt-get install dropbox.


I've had nothing but a completely smooth experience with Dropbox on Ubuntu. YMMV?


I tried to set up Dropbox without any sort of a GUI. I know that their .deb package is partially an extension to Nautilus, and since I run Xubuntu I don't have that installed. Instead I followed the poorly written instructions at http://wiki.dropbox.com/TipsAndTricks/TextBasedLinuxInstall which resulted in a .py file being invoked via ~/.profile, which still creates a GUI window to ask you for your information, etc. Why not have a standard daemon with a ready-made init.d script rather than this madness?
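What I wanted was something closer to this hypothetical one-liner wrapper (it assumes the daemon the text-based install drops into ~/.dropbox-dist, per that wiki page):

  # start the headless daemon in the background instead of from ~/.profile
  nohup ~/.dropbox-dist/dropboxd >/dev/null 2>&1 &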


I use ninite all the time (it's brilliant!), but only ever for initial installation. Does it handle updating apps?


Yes, just run the installer again (they always get the latest versions, regardless of when they were made). We're looking into building some sort of dedicated update checker too.

Glad to hear you like it.


OS X has a package manager, it just isn't installed by default. Check out MacPorts or Homebrew.


Yes, I am aware of these projects. They, however, don't have the advantages that things like apt have. For example, you cannot use them to update your entire OS. Instead you must order DVDs from Apple and then boot from them. Yes, installing certain software this way is easier, but it is not nearly as convenient as having the entire OS built around it.

On the other hand, existence of these projects indicates that Mac users are dissatisfied with how installing/updating/maintaining software is done in Mac OS X and are looking for something better.

Lastly, I don't necessarily think that apt is the end-all, be-all silver-bullet answer to all questions. I just think that, out of the available technologies today, OSes that are built around package management work better and are easier to use than anything else out there.


The existence of these projects is necessitated because users want to use unix-y programs but don't know how to compile them themselves. I don't use either Homebrew or MacPorts anymore. I used to have them, but when I wanted to build something that depended on something I got from Homebrew or MacPorts it didn't quite work, and I ended up downloading the source and compiling it into a framework using Xcode myself anyway.

I've done this so many times it doesn't take me more than a few minutes to get all the dependencies in order (I can reuse frameworks I've made before). The frameworks are embedded into the application bundle and can be used on any other computer that can run universal binaries. That makes the app portable, too. The beauty of all this is that other computers don't even need a compiler installed and they can still use the same programs I built. Of course it'll never beat apt... but it's the easiest way of getting that DVD-to-iPod video converter onto my aunt's computer.

I believe you're right. If you want package management the OS has to be built around it, otherwise it won't work quite right (unless you put time into it, of course).

I still will use Mac OS X, though; it needs less configuration and everything just works, or maybe I'm just too used to it to use anything else. Mac OS X is built for consumers, and for developers who develop for consumers who use Apple products. For those consumers and developers, Mac OS X is the best there is. Unfortunately, I think that's it. Terminal is still good, though.

To upgrade your OS, you only need to click "Restart" when software update tells you to. The only time you need the DVDs is when there is a major version out, i.e "Leopard", "Snow Leopard", etc.


Bah, it's not really that easy. I remember having an annoyingly difficult time getting a working, up-to-date Glade on the latest LTS version. Sure, it worked on the latest Ubuntu just fine, but shouldn't we be able to have some popular applications up to date without a big mess? But yeah, in general it's pretty great.


I've been using Ubuntu since 6.04, longer than any other Linux distribution (one or the other has been my main desktop since 1999). I have to say, though, that of course Ubuntu wouldn't exist as it is without the grandeur of Debian - thanks, Debian.


You probably meant 6.06. That was my first Ubuntu as well, though it was also my first Linux distribution.

http://en.wikipedia.org/wiki/List_of_Ubuntu_releases


Right, 6 something is all I remember at this point! Huh... looking at that list, it might have been even earlier, because I remember Breezy Badger and Hoary Hedgehog. I recall Dapper Drake being a particularly good release, so you jumped in at a good time!

So, I guess actually 5.04 was when I moved over from Debian. I think it was before Debian Sarge was released, which apparently was 2 months before Hoary Hedgehog, so that makes sense.


"it runs on every piece of hardware you throw at it"

That's what continues to impress me about Ubuntu - I've installed it on everything from PPC-era Macs to dodgy little netbooks and it invariably 'just works'.


There was a time when you had to carefully choose your hardware to work with Linux. These days, you have to carefully choose your hardware in order to experience problems.

I am surprised by how many people do the latter.


If 95% of components work on Ubuntu, and your computer life involves 25 components (start counting, you'll hit it pretty quickly), Ubuntu is totally broken for about 75% or so of users (the chance that all 25 components work is 0.95^25, roughly 28%). If Mom's camera or wireless doesn't work, she does not perceive the computer as being mostly functional.


I actually did install Ubuntu on my mother's system and haven't had any such gadget shortcomings.

What kind of life of bizarre peripheral addiction do your loved ones live? I can count for myself wired USB headphones, a mouse, a printer, speakers, wireless headphones, a camera (if putting the card into the built in card reader counts), an iPod and... and.. well, that's only seven. They all work with no problem on Linux. What on earth are the other 18 that the average person uses?

We did have a problem with a USB GPS device, but overall the computer is actually a lot more functional compared to back when I had to go over and uninstall a new virus and spyware suite every 30-40 days. My mother seems to like Ubuntu a lot, and when I took the computer the other day, she was complaining about using my father's Windows laptop. It was pretty classic: she wrote and said she missed her computer with the 'mozilla fox' operating system.


I think the 25 component number is a bit of an exaggeration. Also, you assume a perfectly even distribution among component pairs and that's not the way computers are designed these days. You won't find an AMD box with Intel GMA graphics, for instance.


That said, if 95% of all components work on Ubuntu, maybe 99% of common components work on it, and maybe 99.99% of things Dell sells work on it. You're still stuck with that $15 webcam Mom bought at Walmart, though.


All those components come with drivers provided by the component's vendor, and (mostly) for Windows machines only. Linux does a great job of catching up on all those components, but the thing is, Linux is mostly behind. If vendors helped the Linux community a little, we would be going hand in hand.


Every vendor publishes their own buggy version of drivers for what is essentially the same hardware, all neatly packaged with some buggy "value added" crapware that introduces new and unpredictable behaviour mom has to deal with.

No. I really prefer the Linux way.


I don't think it takes that much effort to have trouble with suspend/hibernate. I've found it to be the rule rather than the exception.


True, that used to be a big problem for me as well until about 1 year back. I think it was Karmic that finally solved it on all my laptops.


Suspend works perfectly for me on a Dell Studio XPS, an EeePC, and a MacBook.


What kind of notebook computer have you been using?


On this Dell Mini 10, X will eventually crash following a resume from hibernate, then I drop to low-graphics mode and avoid doing an actual reboot for as long as possible, just hibernating my low-graphics state.


The only bit of hardware I've had problems with recently is Broadcom wireless cards. Worst drivers ever. I still won't ever buy one intentionally, and when I find one I will replace it with an Intel or something else that works without a giant hassle.


Nevertheless, I'd advise reading the release notes before installing. My mom's laptop happened to have the precise laptop configuration that the Lucid Lynx release notes warned I'd have trouble with ... and I hadn't read them :)

My mom's been using Ubuntu! 'Nuff said I think.


Mine too! I just reinstalled 10.04 for her on a random old Dell laptop yesterday, as we'd been upgrading all the way since 8.04. It has been so much more trouble-free than XP for her.

I was having trouble configuring the wireless, I thought... then I noticed my Mac wouldn't connect, either! Rebooted the router, and indeed everything Ubuntu installed worked out of the box.


Not all hardware. I couldn't even get the install disc for 10.04 to boot on my 2010 13" MacBook Pro. I'll give it another try when 10.10 comes out, but Ubuntu definitely does not "just work" on all hardware.


Well, the way OSS driver development works means that you're far more likely to experience problems with new hardware than with old hardware. On top of that, the MacBook Pro, even though it is an Intel-based machine, is about the most closed hardware you could find on this platform. So "couldn't even" may not be the right choice of words for your situation.


I wasn't expecting it to work 100% flawlessly out of the box. But not even able to boot? I feel sorry for Canonical as 10.04 is an LTS release, so they're now stuck with this rather glaring incompatibility for years.


LTS versions have stable patch-level releases every 6 months while they're supported (e.g. 10.04.1 was released in July, and there will probably be a 10.04.2 in January), so they're able to add drivers to the installer when they become available.


Installing Ubuntu on a MacBook feels wrong. It was designed to run MacOS X. Even the chassis finish matches the window chrome...

In my interesting computer collection, I have a couple of IBM PPC/POWER machines. When I acquired the first one, it took me some work to procure a set of AIX disks (DDS tapes would do) that would install on the machine. Several friends of mine suggested I should install Debian and be done with it, but I felt that would make them less interesting and opposed it. When I finally installed AIX, I felt they were complete, with the system they were designed to run, in all its CDE glory.

Last time I plugged a Mac keyboard on my Linux netbook, not even the function keys worked. It muted the speaker when I tried a search...


Considering they have problems with two of the three graphics card vendors, and Wubi doesn't work if you have more than one hard disk partition, "just works" isn't how I would describe it. "Good when it does" and "mostly works" pretty much describe it.


Not every piece of hardware. There are still problems from time to time, but in 95% of cases it works. And that's still better than having to deal with Windows and its problems.


Odd, I regularly see complaints here on HN of this or that not working, and the response seems to often be that you have to pay attention to what hardware you buy and you can't expect everything to work... often with a snarky comment about OSX only being made for Apple hardware.


The new Ubuntus do not work with Intel graphics cards. I do not want to go through workarounds just to get rid of the screen freeze when it boots. I used Ubuntu since around 7.04, but after 8.10 I think the graphics started to break, and by 9.10 it was totally unworkable; I have to restart (hard) several times to get X.org to start.


I would rephrase that as "[some] Intel graphics cards do not work with the new Ubuntus". The problem is that Intel has chosen to prevent Ubuntu from supporting those graphics cards, not that Ubuntu cannot support them and not that Ubuntu chooses not to support them.

In the cases that I looked at, Intel licensed graphics "IP" from third parties that had severe restrictions on what technical information could be published openly, making it incompatible with GPL.


I am using the same computer that used to work with 7.04. How did something that was working in 7.04 suddenly break in 8.10?


You don't say what Intel graphics chip you have problems with, but your problem matches the GMA 500.

http://en.wikipedia.org/wiki/Intel_GMA#GMA_500_on_Linux

"GMA 500 support on Linux is not optimal. The driver is developed by Tungsten Graphics, not by Intel, and the graphic core is not an Intel one, but is licensed from PowerVR. This has led to an uncertain mix of open and closed source 3d accelerated drivers, instability and lack of support."

Read the rest of the section, it gives some hope that the support has improved since 2008.


It is just the onboard graphics that used to come with Intel boards. I used the i810 driver with it, as far as I can remember.


Ahh... Ubuntu. I've been using Linux since pretty much forever and I've gone through Debian, Mandrake, Gentoo, Slackware, you name it.

Sure, Ubuntu has its fair share of idiosyncrasies, not to mention idiotic bits and a... peculiar developer community, to say the least, but on the whole it manages to combine the flexibility and openness of Linux with some of the nicer aspects of closed platforms like OS X.

Bring on 10.10 and beyond!


I started using Ubuntu on my desktop when I started developing for the web, because my web server ran Ubuntu. I love the fact that I can have a server-like OS on my desktop that also pretty much does everything that Windows and Macs do.


Both Windows and Mac OS X have server versions, that can easily be installed on desktops.


...neither of which come with the normal versions (and, indeed, are quite a bit more expensive), so most people will never even touch them on their PCs.


OS X comes preinstalled with Apache. No need for OS X server.


Although there is something to be said for running the same software stack on your desktop as your server.

Though, I'll grant you, web development on OS X is really nice.


Most ppl haven't touched Linux on their PC either ;-)

There is a 6 month trial of Windows Server http://www.microsoft.com/windowsserver2008/en/us/trial-softw...

Students can get a Windows Server licence for $19 http://blogs.msdn.com/b/gautam/archive/2010/02/18/msdnaa-sof...


Seriously? Would you run a web-app on a Mac or Windows server?


Something like 30% of the Web servers on the Internet run Windows. There are millions of ASP.NET developers worldwide and Windows is by far the most popular Web platform for internal business apps.


I've been having a lot of problems with Ubuntu. First of all, wireless on Ubuntu sucks. I can't be bothered reading tons of articles on how to fix wireless; I just want it to work like it does on Windows. Ubuntu doesn't detect a few of my webcams. I can't even install Ubuntu 10.04 on my new laptop (though it seems they're busy fixing it), so I'm stuck with 9.04. I'm also having some trouble with Chrome on Ubuntu, but I should probably blame Chrome for that. Rebooting on Ubuntu doesn't even work; I can only shut it down. I could probably fix it in a timely fashion, but I don't see why it broke in the first place. I've never had any trouble of this kind with Windows.

Anyway, it's still a great OS, but there's no way the average computer user is going to be satisfied with Ubuntu unless the developers somehow fix these little annoyances.


There's no way the "average computer user" is going to be satisfied with an operating system that doesn't come preinstalled on their computer, that isn't officially supported by the manufacturer of their computer, and that they consequently have to deal with the various possible technical problems of.

Non-technical users don't install operating systems.

Ubuntu is the closest candidate to fixing the structural problem of "very few computers come preinstalled with a supported, free-as-in-freedom operating system". Until that problem is fixed, and as long as people are ending up having to install an operating system on arbitrary computers with millions of different hardware combinations, such annoyances will continue at some level.


>Non-technical users don't install operating systems.

Look, this is probably generally true - however, the last bare-metal Ubuntu install I did was as simple as insert disc, click through the install prompts, accept all defaults, and you're done. Yeah, it took an hour or two all told.

It certainly has been quite different in the past, but any user that can watch a DVD on their computer can install a [selected] OS if they want to.

Hardware compatibility is another question. If the manufacturers would just test and have a "Ubuntu ready" label then that issue would be largely moot, at least as much so as with MS Windows.


I do agree that unless you're particularly unlucky hardware-wise, an Ubuntu installation is more likely to succeed nowadays than fail. But they may be able to survive the installation, yet end up with minor to moderate problems that degrade the entire experience, which wouldn't be there, had their operating system been preinstalled, tested and supported with their hardware.

The question of hardware compatibility and other technicalities aside, the entire notion of the "operating system" being decoupled from the computer and installable separately is alien to most non-technical computer users, and the idea of installing a "third party" operating system is simply uninteresting; which is why I said "don't", not "can't". Why should they, when they already bought a computer with an "operating system", and it largely works?

My point is that the "download an .iso file from the internet, burn a CD / flash disk and install something called an operating system" model, while it's largely successful for the audience it's intended for today, is not the way to advance the use of free operating systems in the broader computing ecosystem; striving towards more viable and widespread Linux-preinstalled devices is. But surely the success of the former will reinforce the latter, since a critical mass is needed.


The drivers problems are a manufacturer problem, not Ubuntu's. Complain against the manufacturer, so that they know that people actually want linux drivers. Vote with your wallet, buy linux compatible hardware. Don't blame Ubuntu for others' mistakes.


That is correct. No hardware will work with every OS. Certain hardware will support certain versions of Ubuntu though. It makes life a lot easier and saves an enormous amount of time if you research what the hardware will support first, and then use the correct hardware and OS combinations. With an unsupported combination sometimes it will look like everything is ok but you will have intermittent problems such as USB ports not working sometimes, or other issues. With the correct combination of hardware Ubuntu will work fine.


Sure, I might have unnecessarily blamed Ubuntu, but it's still a problem that needs to be fixed. And as long as that problem exists, I won't recommend Ubuntu to the average computer user.


Why? The "average computer user" is precisely the kind of person who is likely to have commodity hardware that is supported by Ubuntu. Would you not recommend it because you can't guarantee that Ubuntu will work with all her hardware? If that's the problem, then what would you recommend? Can you guarantee that some other OS supports every piece of hardware she owns? Or are you holding Ubuntu to a higher standard?

If the answer is simply that you believe another OS will support more (though maybe not all) of her hardware, why not at least recommend that she get a Live CD, try it and see? Then she can weigh for herself the benefits of a (mostly) Free software OS against the extra cost of getting her hardware to work, instead of just accepting your guess that it's not worth it.


Because the average user just wants his computer to work, he doesn't want to try something unfamiliar and risk doing even more work to get the computer working. Maybe we don't have the same definition for "average user".


Hardware can be expected to work on Windows. If they don't, it's the hardware manufacturer's fault. There might be nothing you can do about it, but it's their fault. The manufacturers are expected to support Windows. If hardware doesn't work on Ubuntu, it's your fault for installing an unsupported system.


> Hardware can be expected to work on Windows. If they don't, it's the hardware manufacturer's fault.

Sorry, this is an unreasonable expectation, and one that should be corrected, not reinforced. The "average user" has no right to expect that an operating system should work with any and all hardware, simply because one operating system has a monopoly position with respect to users and a monopsony-ish position with respect to hardware makers. It hurts all computer users, and Free software users in particular, when we allow such companies to set our expectations.


Who is this "average computer user"? Anybody that makes the jump to Linux has a reason to(and there many, not just tinkerers), otherwise they stick with a windows machine or pony up for a mac.


It's weird. I used to love Ubuntu, but after 8.04 I just stopped liking it. The little things niggled away at me. The bloat, the fear of upgrades breaking things (this is why my main home file server still runs 8.04, something I'm not happy about) if you have a slightly nonstandard install. I switched to Arch for my main home laptop and breathed new life into a PIII-850 with 128mb of RAM. Ubuntu would've killed it.

In a bizarre twist of fate, work requirements meant having to use Windows Vista (the horror!) then 7, where so far the best feature seems to be "It's not vista", so no more Ubuntu at work for me, except through VMs.

Still, Ubuntu did a heck of a lot to make Linux easier for the masses, and I have to say thanks to all those at Canonical and the Ubuntu community that made it possible.


Weird. It's since 8.04 and onwards that upgrades STOPPED breaking things for me. It's been a golden era for Ubuntu.


It seems like since 8.04 they've changed the package naming conventions for LaTeX twice. That's been the biggest pain about upgrading for me, everything else has been a breeze.


I think they inherited that transition from Debian, where it's also been a bit of a pain. Part of the mess is that it wasn't just repackaging/renaming, but a switch of upstream sources too, which weren't exact drop-in replacements for each other: teTeX hasn't been actively maintained since 2006, so Debian (and most other distros) switched to TeX Live as their new upstream.


Yeah, I set up disk encryption for all my partitions except /boot, which, if I remember the install (6.06 LTS) correctly, wasn't a standard option at the time; upgrading to 8.04 caused me a whole load of grief.

I know I should rebuild it and it'll probably be fine, but if I was going to rebuild I think I'd use Arch instead (just a personal choice).

But the VMs of Ubuntu that I use of newer versions, yeah they're great.


I run AwesomeWM on top of a minimal Ubuntu install, and I enjoy the best of both worlds. I get access to the whole .deb racket while at the same time getting rid of all the bloat. But you're completely right about upgrades breaking things. Since Jaunty, I can't remember a single upgrade that didn't make me thank god for Dropbox.


XMonad also works fine.


My workplace experience is that most of these so-called "average Windows users" don't even understand the Windows environment to a functional level and need hand-holding to use basic features like Remote Desktop, creating a mapped drive, or navigating the file system. They don't have any interest in learning more about computers; they just absorb the minimum possible computer skills to perform required office work. They don't even really understand the office tools they use every day, like Microsoft Office. As ignorant as they are about Windows, they are more than willing to jump ship to an unfamiliar Apple computer, as they have heard it is even simpler to use. These "office workers" know how to click (oops), then double-click the big motha icon to open the data entry application, type away all day, then leave the application open so they don't have to open it again. This stuff is going to simplify down to something like a fast food restaurant cash register with food icons on the keys.


I like Ubuntu quite a bit, I just wish they would offer a Base install option. I like their hardware detection, Gnome modifications, and font smoothing, but I am not a fan of the Ubuntu One stuff or its kitchen sink approach to including software.

Basically, I just want Ubuntu's base GNOME package, so that from there I can add what I want.


Try the gnome-stracciatella-session package, or choose the "Install a command-line system" option in the alternate installer and add what you like later.


I'm so sad that so many HNers cannot understand what the real value added by Ubuntu is. If more hardware is supported, it's because the kernel tends to get drivers for most hardware nowadays. If apps are getting better, it's not because of Ubuntu, but because of the apps' developers. The power of Ubuntu is only in marketing.


I kind of agree with this. The article is not really about Ubuntu, it is about Linux distros in general. I have been using Mandrake/Mandriva for at least eight years. I tried Ubuntu a while ago to see if the buzz was justified. I was unconvinced. The "user-friendliness" was lower than Mandriva. But it seems Mandriva does not have the mind share that Ubuntu seems to have captured.

On the other hand, Ubuntu's success is success for Linux in general, so I have nothing against it. It just surprises me.


Ubuntu is excellent, except:

1. Firefox and Chrome are very slow and freeze when opening many tabs.

2. The Flash plugin crashes constantly.

3. OpenOffice is very slow and freezes. It can't properly render MS Office documents.


> 1. Firefox and Chrome are very slow and freeze when opening many tabs.

> 2. The Flash plugin crashes constantly.

Try that without the flash plugin.

> 3. OpenOffice is very slow and freezes. It can't properly render MS Office documents.

I find it more likely that Office can't save its files according to whatever is the current Office file spec.


That may be the case, but I bet Office can handle those documents just fine. The non-techy user doesn't care why something doesn't work. They just care that it doesn't work.


1, 2. Try using Chromium and FF4 (Minefield for 64-bit).
3. These are known OO problems; version 3.2 is a lot better.


I recommend the nightly chromium dev build if you can do it.


I am considering moving to Fedora from Ubuntu. One of my motivations is descriptions of better user experiences overall, plus more cutting-edge stuff like systemd. Moreover, I think I am growing more and more colored by the notion that Ubuntu doesn't really employ a significant number of core developers.

Does anyone have experience with the differences in the daily experience of using Fedora vs. Ubuntu, especially for a clean, simple setup as a developer machine?


I'm a Fedora contributor and closet Ubuntu user. While I can get extremely upset at how difficult things can be in Fedora, like Python and Ruby, I generally try to take care of these things myself anyway. And who isn't using RVM anyway, so why does it matter in the first place?

Truth is, I spend way too much time hacking at my Fedora system to make things work. It gets frustrating. I fall back to the Ubuntu systems and somehow, it's just easier. I'd comment on the details here, but it is madness.

So why even bother with Fedora at all? People. I can't stand Ubuntu people. From uneducated bloggers to Ubuntu fanbois, it drives me insane. You know there's a popular blog called OMG Ubuntu!? As for uneducated views, there's a lot of talk about how awesome NetworkManager is in Ubuntu... you know where all that awesomeness came from? Fedora!

The Fedora community? Just amazing talented people all over the place. I can't stand to give them up. Some of the most awesome, unique, thoughtful and mindful people I've ever met all have Fedora in common.

In the end, I can't see how much feature X has to do with anything over it not existing in another distro. I can sort of see how something as shiny and new as a new Ubuntu release can be appealing, but not enough to move me. It all comes down to people, and Fedora has what really matters here.


I share your absolute hatred of the Ubuntu user community — from inane theming to blogspam with <pre> elements that contain huge blocks of sudo apt-get install…


I love Ubuntu! Although Lucid has done something that drives me nuts. The devs decided it would be nice to add a GNOME login to Netbook Remix. Sounds nice, but they decided to limit a bunch of features in the old interface so it would not conflict. I love you, Ubuntu, but you need to stop taking away my power!


Yup, Ubuntu is awesome. After installing it for the first time, I never went back to other distros for my GNU/Linux desktops.

Too bad that Canonical is unsustainable, and we'll soon lose it if FOSS enthusiasts don't shift their mindset a little bit.


Regarding all the rants in here, I simply cannot understand how people could complain so much about such a good product given to them for free.


I love Ubuntu as much as the next guy. I run all my servers on Ubuntu.

But to say that Mac OS X is not as usable as Ubuntu is blatantly wrong.


I too used to think like you do. But I've been using Ubuntu at work, and now when I go on a Mac I actually miss the usability features of Ubuntu.

Just a tiny example: hitting Alt and clicking anywhere on a window to move it instead of going to that top bar. Or the multiple different ways to switch between apps. Etc., etc. There are probably 100 small things that I now can't live without. So I would say the statement "Mac OS X is not as usable as Ubuntu" is actually true. Surprisingly.


I would agree with this. At my new job I was forced to use a Mac for my first couple of weeks and it was like being put in a straitjacket. A lot of the GUI can be manipulated from the keyboard in GNOME, and this is what I missed the most. If the Mac had alternatives I'd be happy to use them, but they weren't immediately obvious after a Google or two.

Conversely if you're a 'mouse person' then this may not be an issue for you.


For me, the best thing about Ubuntu is that they really care about, and more importantly invest in, the UI experience.


I ♥ Ubuntu, and have since 7.04.


Love you, Ubuntu.



