Packages tend to thrust some completely idiosyncratic configuration scheme down my throat, documented nowhere, that creates conflicts of its own. They don't just solve dependencies; they encourage them, because dependencies are supposed to be resolved automatically anyway. But they're not. Things break all the time. The Linux dependency hell has far surpassed the infamous Windows DLL hell of the 90s.
I've been called in to help people fix their Ubuntu installations after a package upgrade uninstalled half the system. I once had an upgrade replace a working but proprietary wifi driver with a non-functional "free" one, etc., etc.
Most of the problems I've ever had with Ubuntu were caused by that completely broken package-management idea. If I want Python 2 and 3 on the same system, I can't have a package manager telling me that's not supposed to happen when I know full well it's no problem.
This is a good thing. Developers' attitude toward dependencies gets very lax when it's too much work to declare them or too much work for the user to satisfy them. Then you get into situations where you install a piece of software and 90% of the functionality is broken, because the developer was "considerate" enough not to "force" a bunch of dependencies on you, so you resort to Google to find the real dependencies, which of course aren't available for your distro. Apt gives developers a flexible way to declare different kinds of dependencies, and the Debian and Ubuntu distro repositories give them confidence that the dependencies will install smoothly for users.
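To make that concrete, here's a hypothetical debian/control stanza (the package and library names are made up) showing apt's graded dependency fields: Depends must be satisfied before the package will install, Recommends is installed by default but can be declined, and Suggests is purely optional:

```text
Package: exampleapp
Version: 1.0-1
Depends: libc6 (>= 2.7), python (>= 2.5)
Recommends: exampleapp-docs
Suggests: exampleapp-plugins-extra
```

So a developer who wants to be "considerate" can demote an optional extra to Suggests instead of silently omitting the dependency altogether.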
I had much more difficulty with undeclared dependencies back in the day than I have with excessive dependencies now. It's just wrong for software to easily install in a mostly broken state. If you want broken, partly-working software, you should have to do extra work to achieve that, not the other way around.
Package management is a solution to a problem that doesn't need to exist in the first place.
1. Upgrading the entire OS at once.
2. Easy updates.
3. Almost never having any issues with incompatible software. When issues arise I can easily roll things back or wait until they are fixed.
4. Never having to hunt for software on the web.
5. Ability to script server/workstation deploys.
6. Never having to spend more than 5 minutes on maintaining my workstation.
If there is an alternative way of approaching it, I'd be very happy, since it can only be an improvement upon the already stellar apt/yum/BSD ports, etc.
But if the applications are all statically-linked, I don't have to worry about dependency conflicts, and some things are a lot simpler, e.g. copying the binaries off to a flash drive to take them to another system, or running two different versions of the same software simultaneously.
On Windows, I don't usually even run installers anymore; I extract the installers, collect all of the DLLs into the application directory, and compress everything with UPX. Once the program is running out of a self-contained directory, I can run it off a thumbdrive, copy it to my Dropbox, etc. Plus I have the option of not updating software when I prefer not to.
The only potential problem I see is the statically-linked-applications issue. However, in the case where libraries are open and shared among many applications and other libraries, I think it makes no sense to use static linking. Its advantage is that you don't need to worry about dependencies, but that "is a solution to a problem that doesn't need to exist in the first place", since package management takes care of those issues.
IMHO it's better to spend 30 minutes once a year resolving an issue that arose when a Debian/Ubuntu repository got screwed up for a short period of time, than to constantly try to solve the problem of managing software installation/upgrades.
However, this comes at a price: You'll always have to provide a newer version of the application whenever a serious issue is fixed in one of your dependency libraries. This means extra work for you. Also, you'll have to take care of an auto-upgrade mechanism.
On systems with a package manager, you'll only have to provide updates if you fix an issue in the application itself. You also don't have to write an auto-upgrade mechanism. Just upload a new package to your server.
However, if you don't care that much about users' security, statically linking is indeed a lot less work than using a package manager.
(FWIW, I'm the maintainer of Mingw-cross-env (http://mingw-cross-env.nongnu.org/). This is a project that provides a cross-build toolchain for statically linked Windows applications. It builds a cross compiler and lots of free software libraries. We have that kind of discussion from time to time on our mailing list.)
Also, you might like http://sta.li/ ;)
Thanks for the link to sta.li, I'll check it out. What I'd really like to see is a Linux distro that managed applications the way OSX does - every program is contained in its own directory.
That exists, it's called Gobo Linux. I've never touched it, but you can give it a try:
You can; it's called "pinning:" https://help.ubuntu.com/community/PinningHowto
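For reference, a pin is just a stanza in /etc/apt/preferences. A hypothetical example that holds firefox at the version from one release even when a newer one is offered:

```text
Package: firefox
Pin: release a=jaunty
Pin-Priority: 1001
```

A priority above 1000 even allows downgrades; the apt_preferences man page spells out the exact semantics.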
> If I want Python 2 and 3 on the same system I can't have a package manager tell me that's not supposed to be when I know full well that it's no problem.
Let me check; here on Jaunty I can separately install python2.4, python2.5, python2.6, python3.0, & python3.1.
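They coexist because each release ships its own versioned binary; "python" is just a symlink to the default one. A quick sanity check, assuming a python3 binary is on the PATH:

```shell
# Versioned interpreters (python2.6, python3.1, ...) never collide on disk.
# sudo apt-get install python2.6 python3.1
python3 -c 'import sys; print(sys.version_info[0])'   # prints 3
```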
I don't doubt that you've had problems with the package managers, but, um, I think you're doing it wrong.
Installing Python 3 alongside the old version didn't work for a long time after Python 3 came out. If it works now, that's fine. I compiled my own Python 3 a long time before it finally appeared in my system's package manager.
[Edit] And one more thing. I'm not a sysadmin, and my interest in system configuration and change management is very limited. But I've been using Linux for a very long time, so if I'm still doing it wrong, then many people will be doing it wrong, and that means there's something wrong with the way it works.
You really should give Slackware a try. None of that sophisticated package management nuisance and everything just works!
> you're doing it wrong
In case you hadn't noticed, I'm a bit bitter about my Fedora install at the moment :)
That said, I think you overestimate the delta between OSX and the various linuxes. On the Mac, as with linux, if you want to use a package that isn't part of the core system, you should package it into your app/lib yourself. The big advantage that the linuxes have is that they have a far bigger range of packages in the repository than you have in a mac's standard install. But by the same token, you can't have as much confidence that all the libs play well together, because they haven't been tested as thoroughly as Mac OSX's set of libs...
I'm genuinely interested - I'm a new Mac user who came from Linux and my biggest issue on the Mac so far is that package management isn't as good as apt. There was a discussion on HN and I was disappointed to find a lot of other people have had the same feeling.
Also, while Homebrew has an official repo, it's managed through git and hosted on Github. So for any given piece of software, chances are pretty good that someone out there has packaged it.
I don't go the whole way and just install Debian on the machine mostly because having OSX handle things like the touchpad driver, hibernation, wifi, etc., is less hassle, and I also like OSX as a desktop environment (e.g. being able to use multitouch gestures in the browser). But Linux in a VM is definitely a nicer experience imo than fink was, as a way of getting a parallel Unix CLI userland onto OSX.
Have you tried this recently, or do you just say that because it's the stereotype? Because it's been over three years since I had any hardware trouble with Ubuntu. These days I have more issues on my friend's OS X Macbook (no drivers for 3G modems out of the box) than I do on Ubuntu where everything just works.
Also, Interface Builder.
If you're on OSX, the ace up its sleeve is Homebrew. Definitely the easiest package management system anywhere.
Things are better on Linux, but not all rosy. Installing Dropbox, for example, is a surprisingly crappy experience (windows popping up, separate proprietary daemon download, etc). That, and way more user requests than you'd imagine, is why we have http://ninite.com/linux now too.
As for Dropbox, yes, I recently went through installing it. Terrible experience on Ubuntu. I think the reason is mostly that it is using a proprietary daemon. A FUSE driver seems like a much better idea, and if they released it under an open license, it would be as easy as apt-get install dropbox.
Glad to hear you like it.
On the other hand, the existence of these projects indicates that Mac users are dissatisfied with how installing, updating, and maintaining software is done in Mac OS X and are looking for something better.
Lastly, I don't necessarily think that apt is the be-all and end-all silver-bullet answer to all questions. I just think that, out of the technologies available today, OSes that are built around package management work better and are easier to use than anything else out there.
I've done this so many times that it doesn't take me more than a few minutes to get all the dependencies in order (I can re-use frameworks I've made before). The frameworks are embedded into the application bundle and can be used on any other computer that can run universal binaries. That makes the app portable, too. The beauty of all this is that other computers don't even need a compiler installed and they can still use the same programs I built. Of course it'll never beat apt... but it's the easiest way of getting that DVD-to-iPod video converter onto my aunt's computer.
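For anyone curious, the embedding step boils down to copying the framework into the bundle and rewriting the install name so the binary looks the library up relative to itself. A rough sketch (all paths and names here are made up for illustration):

```shell
# Copy the framework into the app bundle, then point the executable at it.
cp -R MyLib.framework MyApp.app/Contents/Frameworks/
install_name_tool -change \
    /usr/local/lib/MyLib.framework/Versions/A/MyLib \
    @executable_path/../Frameworks/MyLib.framework/Versions/A/MyLib \
    MyApp.app/Contents/MacOS/MyApp
```

After that, the app carries its own copy of the library wherever it goes.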
I believe you're right. If you want package management the OS has to be built around it, otherwise it won't work quite right (unless you put time into it, of course).
I'll still use Mac OS X, though; it needs less configuration and everything just works, or maybe I'm just too used to it to use anything else. Mac OS X is built for consumers and for developers who develop for consumers who use Apple products. For those consumers and developers, Mac OS X is the best there is. Unfortunately, I think that's it. Terminal is still good, though.
To upgrade your OS, you only need to click "Restart" when Software Update tells you to. The only time you need the DVDs is when there is a major version out, e.g. "Leopard", "Snow Leopard", etc.
So, I guess actually 5.04 was when I moved over from Debian. I think it was before Debian Sarge was released, which apparently was 2 months before Hoary Hedgehog, so that makes sense.
That's what continues to impress me about Ubuntu - I've installed it on everything from PPC-era Macs to dodgy little netbooks and it invariably 'just works'.
I am surprised by how many people do the latter.
What kind of life of bizarre peripheral addiction do your loved ones live? I can count for myself wired USB headphones, a mouse, a printer, speakers, wireless headphones, a camera (if putting the card into the built in card reader counts), an iPod and... and.. well, that's only seven. They all work with no problem on Linux. What on earth are the other 18 that the average person uses?
We did have a problem with a USB GPS device, but overall the computer is actually a lot more functional compared to back when I had to go over and uninstall a new virus-and-spyware suite every 30-40 days. My mother seems to like Ubuntu a lot, and when I took the computer the other day, she was complaining about using my father's Windows laptop. It was pretty classic: she wrote and said she missed her computer with the 'mozilla fox operating system'.
No. I really prefer the Linux way.
My mom's been using Ubuntu! 'Nuff said I think.
I was having trouble configuring the wireless, I thought... then I noticed my Mac wouldn't connect, either! Rebooted the router, and indeed everything Ubuntu installed worked out of the box.
In my interesting computer collection, I have a couple of IBM PPC/POWER machines. When I acquired the first one, it took me some work to procure a set of AIX disks (DDS tapes would do) that would install on the machine. Several friends of mine suggested I should install Debian and be done with it, but I felt that would make the machines less interesting, so I opposed it. When I finally installed AIX, I felt they were complete, running the system they were designed to run, in all its CDE glory.
Last time I plugged a Mac keyboard on my Linux netbook, not even the function keys worked. It muted the speaker when I tried a search...
In the cases that I looked at, Intel licensed graphics "IP" from third parties that had severe restrictions on what technical information could be published openly, making it incompatible with GPL.
"GMA 500 support on Linux is not optimal. The driver is developed by Tungsten Graphics, not by Intel, and the graphic core is not an Intel one, but is licensed from PowerVR. This has led to an uncertain mix of open and closed source 3d accelerated drivers, instability and lack of support."
Read the rest of the section; it gives some hope that the support has improved since 2008.
Sure, Ubuntu has its fair share of idiosyncrasies, not to mention idiotic bits and a... peculiar developer community, to say the least, but on the whole it manages to combine the flexibility and openness of Linux with some of the nicer aspects of closed platforms like OS X.
Bring on 10.10 and beyond!
Though, I'll grant you, web development on OS X is really nice.
There is a 6 month trial of Windows Server http://www.microsoft.com/windowsserver2008/en/us/trial-softw...
Students can get a Windows Server licence for $19 http://blogs.msdn.com/b/gautam/archive/2010/02/18/msdnaa-sof...
Anyway, it's still a great OS, but there's no way the average computer user is going to be satisfied with Ubuntu unless the developers somehow fix these little annoyances.
Non-technical users don't install operating systems.
Ubuntu is the closest candidate to fixing the structural problem of "very few computers come preinstalled with a supported, free-as-in-freedom operating system". Until that problem is fixed, and as long as people are ending up having to install an operating system on arbitrary computers with millions of different hardware combinations, such annoyances will continue at some level.
Look, this is probably generally true. However, the last bare-metal Ubuntu install I did was as simple as: insert the disc, click through the install prompts, accept all the defaults, and you're done. Yeah, it took an hour or two all told.
It certainly has been quite different in the past, but any user who can watch a DVD on their computer can install a [selected] OS if they want to.
Hardware compatibility is another question. If the manufacturers would just test and have a "Ubuntu ready" label then that issue would be largely moot, at least as much so as with MS Windows.
The question of hardware compatibility and other technicalities aside, the entire notion of the "operating system" being decoupled from the computer and installable separately is alien to most non-technical computer users, and the idea of installing a "third party" operating system is simply uninteresting; which is why I said "don't", not "can't". Why should they, when they already bought a computer with an "operating system", and it largely works?
My point is that the "download an .iso file from the internet, burn a CD / flash disk and install something called an operating system" model, while it's largely successful for the audience it's intended for today, is not the way to advance the use of free operating systems in the broader computing ecosystem; striving towards more viable and widespread Linux-preinstalled devices is. But surely the success of the former will reinforce the latter, since a critical mass is needed.
If the answer is simply that you believe another OS will support more (though maybe not all) of her hardware, why not at least recommend that she get a Live CD, try it and see? Then she can weigh for herself the benefits of a (mostly) Free software OS against the extra cost of getting her hardware to work, instead of just accepting your guess that it's not worth it.
Sorry, this is an unreasonable expectation, and one that should be corrected, not reinforced. The "average user" has no right to expect that an operating system should work with any and all hardware, simply because one operating system has a monopoly position with respect to users and a monopsony-ish position with respect to hardware makers. It hurts all computer users, and Free software users in particular, when we allow such companies to set our expectations.
In a bizarre twist of fate, work requirements meant having to use Windows Vista (the horror!) then 7, where so far the best feature seems to be "It's not vista", so no more Ubuntu at work for me, except through VMs.
Still, Ubuntu did a heck of a lot to make Linux easier for the masses, and I have to say thanks to all those at Canonical and the Ubuntu community that made it possible.
I know I should rebuild it and it'll probably be fine, but if I was going to rebuild I think I'd use Arch instead (just a personal choice).
But the VMs of Ubuntu that I use of newer versions, yeah they're great.
Basically, I just want Ubuntu's Base Gnome package, so that from there I can add want I want.
On the other hand, Ubuntu's success is success for Linux in general, so I have nothing against it. It just surprises me.
1. Firefox and Chrome are very slow and freeze when opening many tabs.
2. The Flash plugin crashes constantly.
3. OpenOffice is very slow and freezes, and can't properly render MS Office documents.
> 2. The Flash plugin crashes constantly.
Try that without the flash plugin.
> 3. OpenOffice is very slow and freezes, and can't properly render MS Office documents.
I find it more likely that Office can't save its files according to whatever is the current Office file spec.
Does anyone have experience with the differences in daily use of Fedora vs. Ubuntu, especially for a clean, simple developer machine setup?
Truth is, I spend way too much time hacking at my Fedora system to make things work. It gets frustrating. I fall back to the Ubuntu systems and somehow, it's just easier. I'd comment on the details here, but it is madness.
So why even bother with Fedora at all? People. I can't stand Ubuntu people. From uneducated bloggers to Ubuntu fanbois, it drives me insane. You know there's a popular blog called OMG! Ubuntu!? As far as uneducated views go, there's a lot of talk about how awesome NetworkManager is in Ubuntu... you know where all that awesomeness came from? Fedora!
The Fedora community? Just amazing talented people all over the place. I can't stand to give them up. Some of the most awesome, unique, thoughtful and mindful people I've ever met all have Fedora in common.
In the end, I can't see how much feature X has to do with anything over it not existing in another distro. I can sort of see how something as shiny and new as a new Ubuntu release can be appealing, but not enough to move me. It all comes down to people, and Fedora has what really matters here.
Too bad that Canonical is unsustainable, and we'll soon lose it if FOSS enthusiasts don't shift their mindset a little bit.
But to say that Mac OS X is not as usable as Ubuntu is blatantly wrong.
Just a tiny example: hitting Alt and clicking anywhere on a window to move it, instead of going to that top bar. Or the multiple different ways to switch between apps. Etc., etc. There are probably 100 small things that I now can't live without. So I would say the statement "Mac OS X is not as usable as Ubuntu" is actually true. Surprisingly.
Conversely if you're a 'mouse person' then this may not be an issue for you.