In particular, note that the promise of kernel binary compatibility does not guarantee that an old binary will run on a modern Linux distro unless the binary is statically linked. Most user-space libraries bump their major version number every so often, so it's unlikely that the required .so's for a very old binary will be present on a new system.
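A quick way to see this in practice (a sketch; "old-app" and the library names are made up): ldd lists the shared objects a dynamically linked binary needs, and any "not found" entry is a library the loader can no longer resolve on the newer system.

    # Inspect an old dynamically linked binary on a modern distro; a
    # "not found" line is a missing .so that keeps it from starting.
    $ ldd ./old-app
          libpng12.so.0 => not found
          libc.so.6 => /lib/libc.so.6 (0x...)

    # A statically linked build carries everything with it:
    $ gcc -static -o hello hello.c
    $ ldd ./hello
          not a dynamic executable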
Static linking is one of the nice things about the OS X ecosystem. If only Apple wouldn't unnecessarily break their runtime environment every few releases.
Probably because dynamically-linked binaries are smaller, use less memory (by avoiding duplication), and can get fixes or security updates without rebuilding or re-deploying. When you're a distro it hardly makes sense to ship a copy of libc inside every single binary. Any fix/update to libc would require re-downloading basically the whole system!
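To get a feel for the scale (an illustrative one-liner, not exact accounting), count how many executables on a typical system link against the shared libc; every one of them would carry its own copy if statically linked:

    # Count the /usr/bin executables that resolve the shared libc.
    # All of them share one on-disk (and largely one in-memory) copy.
    $ for f in /usr/bin/*; do
        ldd "$f" 2>/dev/null | grep -q 'libc\.so' && echo "$f"
      done | wc -l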
It gets even stranger in the world of free games. Games often need library fixes (a very typical situation, as games and engines are tightly interwoven) that are not yet in a distro, and won't be for some time, because the library hasn't been officially updated yet, or because a certain patch simply won't be included. On a system like Windows that's no problem: modify the library source and ship the DLL. On Linux distros... well, it's just not easily possible. Which means, funnily enough, that it's easier to modify library sources in the proprietary Microsoft world than in the free software world. We got freedom that comes with library and distro gatekeepers, so to speak... which pretty much sucks (even for the library authors).
"Dynamic linking considered harmful":
OTOH, arguments of the other side, for proper polemics:
"Static linking considered harmful":
Personally, I think the days when it mattered are really coming to an end.
The amount of space (memory/disk/bandwidth) taken up by binary code is minuscule compared with data/video/streaming/games/etc etc
Technically no, not binary compatibility. But practical compatibility? Woo boy do they ever.
As an aside, DLL Hell was coined by Szyperski, who still works for Microsoft.
I fear, however, that Apple is currently destroying this design philosophy with the sandboxing. At least the application data folders have become way more complex now, and I wouldn't like to troubleshoot them anymore, something that has always been super easy.
What runtime-breaking changes are you thinking of? Apple has been very good about not breaking binary compatibility of the runtime or of their frameworks between OS releases.
Sure, there are some differences in 64-bit systems from 32-bit systems. But, if you take a 32-bit Mac app from five years ago, and run it, it will still run just as well today as it did when it was first compiled.
To save on the bloat of shipping multiple redundant libraries with every app, and to guard against the perceived 'DLL hell' in Windows (which was fixed a decade ago). Ironic that it ended up as 'dependency hell', with RPM and Apt packages requiring specific versions of libraries.
Read through this excellent discussion thread if you're really interested in the problems with linking and the lack of a Common Object Model in Linux.
The user does not see any problems except hard disk use (or perhaps a little slowness?) or when the filesystem breaks.
How about just linking it in statically and make the applications standalone?
Anyway back to the point...
>Oh boy SxS. So now you have separate DLLs for every application.
No, only if the application uses a different version that is explicitly marked as NOT compatible. If the version used is the same, you do not have separate DLLs for every application.
>How about just linking it in statically and make the applications standalone?
That will needlessly bloat up the application.
Assume 10 applications need library X version 2.3 and one application needs 2.2. With SxS, you will have one 2.3 DLL and one 2.2 DLL. If you link statically, the same code will be duplicated in 10 EXEs. Multiply this by all applications and all DLLs shared across applications. Not to mention waiting for 10 apps to update to fix a security bug.
You think that's good?
And as far as your example goes: from what I understand, most Windows developers just pin a fixed library version number to specify which DLL to use. When that happens, security updates won't do any good either. IMO, as long as APIs can change between library versions, developers will always be responsible for upgrading to the newest libraries themselves. It's a nice idea, but it just doesn't reflect reality in the world of business applications, where incompatibility directly results in monetary losses.
Sure, but a gig more of RAM would make a lot of difference. http://en.wikipedia.org/wiki/Dynamic-link_library#Memory_man...
The comments in this thread (from my OP) are a good start if you want to learn.
I think the lack of ABI compatibility in Linux has more to do with incentives and economics: innovating while keeping the ABI stable is very difficult and resource-consuming. I don't understand why Miguel would cite Apple, as they are pretty lousy in that department, whereas MS famously spends tons of resources on that issue.
If it weren't for this promise, I think it's much less likely that people happily developing for Unix would have moved over to Linux.
This is the bread and butter of companies like RH (many companies are still on RHEL 5, and most of those have some legacy RHEL 3 systems, in my experience).
But I think the real reason for this policy is its simplicity. I don't know if Linux developers are disciplined enough to follow Solaris-like deprecation schedules. When you allow exceptions, there is a tendency to allow more and more of them.
And the main argument is rotten as heck: backward compatibility out of the box? On a Mac? Please! You have to install Rosetta for that; without it you have no chance of running Escape Velocity on your shiny MacBook Pro. Windows 7 makes you feel like it can do the backward compatibility jig, until you meet old software that runs ultra-hyper-fast on modern hardware. And no, I am not talking about some console application, but the second incarnation of Saints Row.
I mean, I would have understood if he had simply said "I have started using Mac OS insert-cat-name-here and liked it"; nobody would have had any problem with that. I really can't understand the need to declare previously used software dead, bad, or other derogatory terms. You use a Mac, I use Linux, and it's still alive and kicking, considering I got security updates this morning.
Desktop Linux has utterly failed, yes, clearly. OS X grew to over 10%. The Linux desktop continues to remain statistical line noise. That doesn't mean that literally no one on the planet uses it, just that a very small number of people use it.
It seems there's always going to be some people who complain that GNU/Linux isn't 'good enough' or whatever, and as a community we should simply ignore these people.
The desktop never really needed Linux, and Linux never really needed the desktop.
People seemed to like Ubuntu some years ago though.
That depends on how far Microsoft and Apple go with the tabletization of Windows and OS X in the next few years.
I've been a Mac user for years, but after playing with Mountain Lion, I could see the writing on the wall. It was time for a new laptop, so I went with a ThinkPad running Arch Linux.
I don't think that Microsoft will ever manage to completely force the app store on everyone--solely because of enterprise customers--unless maybe they force you to use it on home/basic and make it optional on business/enterprise. However, I do think that's what they and Apple eventually want. I also think that's probably what the average consumer wants.
We need Linux as a third (and 4th, 5th...) option for all of us edge cases that don't fit the standard consumer mold.
I love the TrackPoint (no need to take my hands off the home row, as with a touchpad). I also love the docking station: with my Mac I had to unplug the power, speakers, and USB; now I just push a button and it pops off the dock.
The best example of this is the Arch mishandling of Python 3. Remember when they decided that "python" was now Python 3? If you are a Python dev, then alarm bells should already be ringing in your head. This is a BAD decision with no excuses.
Python is wonderful because it is very portable. I can write Python scripts on my Mac and run them on Linux or even Windows without significant changes. Python 2 and Python 3 are incompatible, and I usually target Python 2 for anything I want to be portable, since Python 2 ships by default on OS X and on lots of Linux distros. This is reality.
My Python 2 scripts begin with a shebang, "#!/usr/bin/env python" which launches the Python 2 executable, wherever it is installed, on OS X or Linux. Unless you are running Arch Linux, in which case you need to go in and manually change the script to point to "python2" instead of "python". Except "python2" is essentially unique to Arch, it doesn't exist on Debian or Fedora or OS X or whatever. So no portable script is EVER going to point to "python2".
But there's no real benefit to making "python" into Python 3. Anyone targeting Python 3 will write "python3", which works wherever Python 3 is installed, including Arch Linux. This is the decision the upstream made, and it's a good decision. Anyone targeting Python 2 writes "python", which works everywhere except Arch, and on Arch you have to edit your scripts. Who wants to keep a separate copy of your scripts for Arch? What if they're not your scripts? Do you write a shell script that runs "sed" to fix them? What if you keep your Python scripts in Git or SVN, and don't want to change them every damn time you check out a fresh copy on Arch?
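One defensive workaround (just a sketch, assuming nothing beyond a POSIX shell and the interpreter names discussed above; "run-py2.sh" is a made-up name) is a tiny launcher that checks what "python" actually is before handing it the script:

    #!/bin/sh
    # run-py2.sh SCRIPT [ARGS...]: run a Python 2 script portably.
    # Check which major version "python" resolves to, and fall back to
    # "python2" (Arch's name for the 2.x interpreter) if it is 3.x.
    PY=python
    if ! "$PY" -c 'import sys; sys.exit(0 if sys.version_info[0] == 2 else 1)' 2>/dev/null
    then
        if command -v python2 >/dev/null 2>&1; then
            PY=python2
        else
            echo "error: no Python 2 interpreter found" >&2
            exit 1
        fi
    fi
    exec "$PY" "$@"

Invoked as "run-py2.sh myscript.py", it behaves the same on Debian, OS X, and Arch. Which of course only underlines the complaint: nobody should need a wrapper just to keep "python" meaning what it means everywhere else.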
Worse yet, lots of Python devs want to install and test with specific versions of Python -- say, the Python version that their servers run. Arch sabotages this, because as soon as you put "python" in your $PATH, a ton of Arch programs break if they target Python 3. And they wouldn't have broken if they just said "python3" in the shebang to begin with.
The result is that Arch has to maintain a fork of EVERY Python script in their repository. For what? No real reason. The only conceivable benefit of this terrible change is that a user can type "python" in the terminal and get Python 3 instead of Python 2. I've been doing this for years with an alias in my .bashrc, and it doesn't break every Python package in the repository.
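For reference, that alias approach is one line in ~/.bashrc (assuming a "python3" binary on the PATH), and since aliases only apply to interactive shells, scripts' shebang lines keep resolving "python" to the system default:

    # ~/.bashrc: typing "python" at the prompt gets Python 3, while
    # "#!/usr/bin/env python" in scripts is unaffected.
    alias python=python3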
So in short, the Arch Linux developers promise a new world in which not only is binary compatibility impossible, but source compatibility for SCRIPTING LANGUAGES is impossible as well. They have taken a bad idea and stuck to it, dismissing anyone who disagrees with them by suggesting that they use "sed" to fix the user's "broken" scripts, in such a way that it would break compatibility with every distribution besides Arch.
And I have further rants about Arch with respect to (1) their dismissal of bug reports of serious security vulnerabilities in default configurations, (2) the dismal performance and general usability of their package manager, (3) the inability of their maintainers to create a correct package for something as simple as a font, and (4) the terrible average quality of advice on their wiki, which often gives advice that simply doesn't work, for well-documented and easy-to-discover reasons.
Gawd, use anything else. Gentoo, even.
Okay, I want to rant more.
(1) Security vulnerabilities are dismissed with reasoning like "users who run this package know what they are doing, and don't run the application on untrusted networks." Considering any (non-virtual) network secure is almost certainly a sign of incompetence in your sysop.
(3) Install "terminus-font" and it won't work. You have to manually "xset fp" the path to it.
(4) See #3, and then look up the wiki advice for it. It just doesn't work if you run e.g. GDM, which doesn't run ~/.xinitrc.
If the Python community eventually wants Python 3 to become the default, they will all have to deal with this issue sooner or later; Arch is just opting for "sooner", as that's in line with the way Arch approaches all changes. The real problem is that there are two incompatible versions of Python, and most Python developers give too little thought to forward compatibility to include an extra '2' in their shebang lines in order to disambiguate their intentions. Just because python == python2 on most systems today does not mean the status quo will forever be the same, nor that it ought to be.
In any case, there is no problem making Python scripts work on Arch, even if they use the ambiguous shebang. All of the official packages work out of the box with no problem, of course, and anything unofficial (i.e. AUR packages) just needs a single, trivial sed command added to the PKGBUILD. Arch is not "maintaining a separate fork of each Python package". This is the opposite of true; in fact Arch packages are, on the whole, far closer to their upstream counterparts than the same packages from other distros, where it is common for huge divergences to be made from the upstream defaults. Hardly worth ranting about.
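For the curious, the kind of one-liner being talked about looks roughly like this (a hypothetical PKGBUILD fragment; the file glob is made up):

    # In the PKGBUILD, retarget ambiguous shebangs before packaging:
    sed -i 's|^#!/usr/bin/env python$|#!/usr/bin/env python2|' src/*.py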
You come across as if the Arch developers viciously attacked your family or something. Maybe it's just a distro with different values than yours, man! Relax, there are lots of other choices! It's not supposed to appeal to everyone, but for those of us who align well with the Arch "philosophy", it's fantastic.
My point is not that it's hard to get Python scripts to run on Arch, my point is that I shouldn't have to do any work at all.
You can see a summary: http://www.python.org/dev/peps/pep-0394/ (note it has not yet been accepted)
"Until the conventions described in this PEP are more widely adopted, having python invoke python2 will remain the recommended option."
"This feature [python2 symlink] will first appear in the default installation process in CPython 2.7.3."
Python 2.7.3 was released on August 9, 2012, slightly over one month ago. That means that if you are running a Python 2 that is more than one month old, you won't have the symlink unless you make it yourself or your distribution provides one for you.
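To spell out what that symlink amounts to (a sketch; the /usr/local prefix is just CPython's default for a from-source install):

    # What a CPython >= 2.7.3 "make install" creates, per PEP 394:
    $ ls -l /usr/local/bin/python2
    lrwxrwxrwx ... python2 -> python2.7

    # On an older 2.x install, make it yourself:
    $ ln -s python2.7 /usr/local/bin/python2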
At work I'm using OS X machines, and "administrating" them is way more effort than Arch Linux. With Arch I run a full upgrade every once in a while, and if something breaks it's always trivial to fix (if you understand the system). With OS X you have to choose one of multiple broken package management systems (MacPorts, Homebrew, ...). A third of the packages I'd like are missing, the next third are outdated, and the last third don't build at all.
It may be perfectly possible that Arch doesn't work for you. But stop assuming that everyone is like you. I don't need someone telling me what to use and what not to use. If you like Gentoo (or OS X or whatever) better, then just use it.
As for 3: adding /usr/share/fonts/local to the X server's font path in /etc/X11/xorg.conf.d/20-fonts.conf does the trick. It's too bad there is no recursive search for font files.
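Concretely (a sketch; the snippet name 20-fonts.conf comes from the comment above): you can test the path in the running session with xset, then make it permanent in an xorg.conf.d snippet:

    # One-off, for the current X session:
    $ xset +fp /usr/share/fonts/local
    $ xset fp rehash

    # Permanent, in /etc/X11/xorg.conf.d/20-fonts.conf:
    Section "Files"
        FontPath "/usr/share/fonts/local"
    EndSection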
It's kind of explained here: https://wiki.archlinux.org/index.php/Fonts#Fonts_with_Xorg
It's too bad you got bitten by The Arch Way(tm) but some people like it.
Err, what about Android?
>I don't think that Microsoft will ever manage to completely force the app store on everyone--solely because of enterprise customers--unless maybe they force you to use it on home/basic and make it optional on business/enterprise
That is already the case. To sideload apps on home/pro versions you need a valid developer license. To sideload on Windows Enterprise, the domain IT admins can do it.
Not really a windows guy, so I didn't know it had gotten that bad. Wow.
>Err, what about Android?
Android is far too rigid in its current iteration to use as a general-purpose OS.
There's just too much you can't do with it. Google has its vision of what it should be, and you have to fight the OS to break away from that (try removing the navigation bar on 4.1).
Android could work as a general purpose OS, but it would require a major fork.
Linux doesn't need to be a threat to Apple/Microsoft in order to be a success. That is an artificial bar that some people seem to have become obsessed with.
I guess there is some merit to this argument in the sense that if Linux was the defacto standard on the desktop I wouldn't have to maintain a dual boot. On the other hand I quite like using a niche OS because it means that I'm not a target for malware/shovelware/adware vendors.
Exactly. I cannot count the number of times I have had the discussion: "You should use [MacOSX/Windows].", "Why?", "My grandmother can use [MacOSX/Windows] all by herself."
It is hard to physically restrain myself when people think that is an insightful thing to say.
Also, why do they refer to GNOME as a "research" project?
As you say, others responded with external vs. internal interfaces, but what is an external and internal interface for things like Gnome or KDE? They have only one API that's used by both their other libraries/applications and application writers.
Which is a problem they without doubt did not inherit from the kernel team.
There is little doubt that the GNOME guys have been using kernel practices to justify their actions. The point being made here is that they are wrong to do so.
Interesting point: Windows, OS X, iOS, etc. get derided for having internal private APIs that they try to prevent external devs from using, and not doing the same thing is now a 'problem' for GNOME and KDE? Isn't the whole point of Linux, for developers, the freedom to use it as you see fit?
I've yet to hear of a philosophy that allows you to both 1) use everything as you see fit, even things marked as "internal" or "private" and 2) give you the right to complain about your software getting broken because you relied on things marked as "internal" or "private".
In other words, you can either restrict yourself to the published APIs and demand compatibility or you can use the guts of the system and have no expectations of stability when the guts change.
To my mind this argument is the greatest argument against OSS for anything but hobbyist applications.
To see the gurus of the Linux world squabbling like this makes it abundantly clear why this platform will never displace the established desktop OSes.
And yes, figuring out why Linux soars on servers but stagnates on the desktop is a question worth finding an answer for.
Why does public debate disqualify a technology?
I think the risk for the future is that the traditional desktop is dying out. In some respects the touch-capable tablet UIs are much more user-friendly. On the other side, the lightweight web-only interfaces (Chrome OS) are becoming increasingly more powerful and useful.
Windows and Mac have already moved to hybrid designs.
But overall, I've found it interesting how little G+ comes up in discussions about Twitter and App.Net. I happen to think it's an absolutely excellent product and find it much more engaging than Twitter. Most people I see complaining about it have made hardly any posts and haven't taken the time to build up a network.
Probably because it's completely Apples-to-Oranges. Twitter/App.net is for short status updates. I deeply value the enforced brevity. G+ posts tend toward blog-style bloviating; the use cases really don't overlap.
At the same time, I can say I've moved a lot of link sharing to G+, so for that use case, it's a fairly direct substitute. The rich embedding is better and I can say what I want without having to cram it. It's like that Pascal quote about not having time to make it concise... G+ just lets you write a couple of sentences, or more on a whim, without having to then manicure it into 140 characters.
This has nothing to do with App.net. I agree, it's one of the dumbest things I've heard of, but has nothing to do with this thread.
Seeing as long posts and discussions are starting to become pretty common on G+ I hope that the G+ team invests some time into figuring out how to make everything more readable.
The GPL license allows companies to contribute technology to the project without fear the technology they contributed will be used in proprietary products from their competitors.
Oracle, for instance, doesn't have to fear HP will use Btrfs in HP-UX and compete with Solaris' ZFS.
We can use the existence proof: FreeBSD exists with a BSD license, and it hasn't materially changed the Linux outcome.
You would most likely have ended up with a bleeding edge open source version and several more conservative closed ports.
Bear in mind that a lot of the reason Linux development is strong is that many of its developers are paid to work on it and then contribute back (again, because of the GPL). If a company could gain a competitive advantage by paying the same people to work on a closed fork...
I agree though, someone would try. I just think Linux would have done just as well with a BSD license being released at the time it was. The people and the timing deserve more credit than the license.
I strongly disagree with this phrase. I'm not saying making Linux usable is 90% of the work; it isn't, but it isn't "the last mile" either (maybe that + customer support + developer relations and evangelism adds up to 15-20%).
The conversation here may parallel the other but I think this link is more valuable.
1. It's not clear how easy it is for non-tech users to dual-boot install alongside Windows, i.e. it's difficult to get started.
2. Make it faster; the latest Ubuntu is sluggish. The UI gets non-tech users excited, but a few minutes of using it and they are going to bounce.
3. Make installing programs easier. I know how to get them running, but I still don't know how to get them properly installed. Average Joe has no chance.
4. Support all common user tasks out of the box, for example viewing videos online.
To me it feels like Ubuntu is 90% of the way there, just a few things stop it from being a good alternative (these things of course are probably not trivial to fix, but I would say they are critical).
a. there were at least two of everything: GNOME, KDE, vi, emacs, gedit, Firefox, Opera, kasablanca, ftpcube ...
b. crap device support: graphics cards, printers, cameras, thumb drives ...
c. highly variable levels of application quality and support
That stupid "Join G+" black bar, the absolutely unnecessary chrome around the content that takes up more than 83% of the screen real estate on my 15" MBP, no links to comments.
What's annoying isn't Unity but the way Ubuntu changes the way startup scripts are done and small things like .xsession don't run by default (and if you turn them on, suddenly .Xmodmap stops working...). That's true for the Ubuntu derivatives too. But that's a fair trade-off for good drivers (even if they are proprietary) and up to date packages. Otherwise I'd go with Debian testing.
Also, installing plugins for Gnome 3 is really convenient. For example, I didn't like Alt+Tab switching between programs instead of windows, so I installed this: https://extensions.gnome.org/extension/15/alternatetab/.
There are still some things that bug me though, like I don't see any way to configure the format of the Date/Time.
It has a very opinionated approach to things and throws away a lot of customisation options (focus follows mouse, you can't move the launcher or go back to a standard taskbar etc).
For example, I sometimes need right-click menus, and those are simply not supported. So the only workaround is installing packages that get me application menus back, which is then a mess where the buttons to close a window switch from left to right all the time, depending on whether the window is fullscreen. Not that I liked top menus to start with anyway. And certainly lots of applications just work badly with the top menu, simply because Linux apps were never written for that. So you get things like double menus, even with some applications that are in the official distro.
Or the idea to use F10 as a system key: sorry, but it just isn't one and never was, which is why it's used by lots of applications that are suddenly missing a key now. Who came up with this shit?
There is one reason Ubuntu stays on my laptop: it works well enough with the hardware most of the time. And I can live with Unity most of the time. But Unity is the reason why Ubuntu doesn't make it onto my desktop.
After 3 days I just installed Awesome and went from there. Unity provides me absolutely nothing that I want that Awesome doesn't. The only thing it does for me is get in the way.
Give me a break.
Whoever thought that was a good idea needs to be thrown off the team, forever. It drove me absolutely nuts until I finally managed to get rid of it.
I use FVWM2 at work with two monitors, when I have time I plan on moving to xmonad. I've used Dropbox and a few other lightweight WMs too. They all worked OK with my dual-monitor setup.
I've used Unity at home since the beginning, and it's the single worst multi-screen experience I've ever had. I only stick with it in hopes that I'll get to watch it evolve into something usable.
There is a strong case for someone paying developers to create a free alternative to Windows to commoditize desktop operating systems, thereby attacking Microsoft's revenue stream.
The problem has been that it's too difficult to produce something easy enough to transition to from Windows and Windows apps for this to work.
See Ubuntu bug number 1. https://bugs.launchpad.net/ubuntu/+bug/1
Mark Shuttleworth is that 'someone'.
>The problem has been that it's too difficult to produce something easy enough to transition to from Windows and Windows apps for this to work.
Every time a Linux distro is made one bit "easier" or simpler, the power users find it "dumbified", quit that distro, and move on.
At every conference I go to, I see tons of Macs. Those developers are very unlikely to find GNOME 3 or Unity too constraining for their tastes.
The problem is that most of those developers have already left Linux and would be unwilling to switch either because of the hardware they like or the Mac app ecosystem or toolchain they use (for example iOS development). I doubt a significant percentage would come back even with a better Linux.
Why don't you ask Canonical?
Where are the lawsuits against System76? If there really was demand, don't you think a company would start up and offer Linux? In fact, there have been OEMs shipping Linux, but they stopped or hid their offerings after the high rate of returns from people buying them and sending them back. And the tech-savvy people who are the real customers can buy ANY PC out there and throw Linux on it (or just get a MacBook Air). There's not much money in it for the OEMs, just high support costs, return rates, and a whole toolchain to maintain for Linux images (hard on thin margins).
If there is real mass-consumer demand, a company can ship Linux profitably; MS won't care about them till they start making hundreds of millions a quarter, at which point successfully fighting a lawsuit based on antitrust case precedent on PCs, or paying a low amount per PC, won't really hurt the manufacturer. So stop painting MS as the bogeyman in this discussion. Windows' and Office's dominance (due to Open/LibreOffice not being up to par for various reasons, including Office compatibility) is a well-known reason.
2). You are right about Microsoft not trying to litigate for Linux servers, but I'd take a guess that is because most companies already have a heterogeneous environment, and if push came to shove I would assume that most companies would consider a full switch to Linux if Microsoft forced their hand.
3). HP and Facebook are the two main companies defending Linux atm, you are right about that: http://www.theregister.co.uk/2011/04/20/facebook_hp_and_open...
4). Dell does not currently distribute any consumer laptop with Linux as an option, at least not when I tried the top 10 popular laptops on their site.
5). They don't profit from putting System76 out of business; in fact, it would probably cause much more financial harm in terms of what it would do to their image, picking on the little guy.
6). You are absolutely right about the low consumer demand. However, don't you think it's weird that NOBODY is pushing it? Not even as an experiment? No smartbook running Android? Why is the Asus Transformer the closest attack we see?
Wrong. It's funny you're commenting on this so vehemently with so much anger against MS in this and previous threads and you haven't even heard of Dell's Sputnik?
They do sell in places where there's more demand, and the Windows tax is high in relative cost of living terms.
They did try a while ago in the US and they didn't sell well and now they're making a dev ultrabook.
Microsoft is plenty evil, but there's no need to make up things.
Edit: How many would buy Dell's laptop? Most devs would prefer their own choice, perhaps a MacBook Air, or to throw Linux on a ThinkPad. How does Dell get a leg up on Lenovo (with Windows) here without making 20 laptops with Linux?
Not if they have a good excuse (patents Android OEMs consider valid enough to license).
> Hell, B&N even tried the antitrust angle in the Android tablet patents case.
Unfortunately, Microsoft and B&N became "partners" before the list of secret patents became public.
Even a hint of any such thing will result in a multibillion-dollar fine paid to the OEMs and an injunction against MS in the courts. What about the fact that there is little interest in the general public for Linux (due to many reasons, including the Wintel and Office monopolies), and that the OEMs are hampered more by that and the reasons in my OP above than by your weak reasoning?
Have you seen how few Chromebooks have been sold? Do you really think MS hampered them with their patent licensing deals (there have been a couple), or do you think there are much bigger reasons that ChromeOS/Linux hasn't taken off? What about new companies like System76? Why don't you start a new company that sells Linux PCs? Do you know how Dell or Compaq started? It was just one or two people in a garage assembling PCs and throwing DOS on them. The difference was that people bought a lot of them.
This whole "blame MS for everything regardless of the real issues" attitude is getting old.
Besides, when all PC makers depend so much on Microsoft, there are many other creative ways to punish OEMs who want to be more independent without raising too many eyebrows.
I shall do exactly that.
>A. Microsoft shall not retaliate against an OEM by altering Microsoft's commercial relations with that OEM, or by withholding newly introduced forms of non-monetary Consideration (including but not limited to new versions of existing forms of non-monetary Consideration) from that OEM, because it is known to Microsoft that the OEM is or is contemplating:
developing, distributing, promoting, using, selling, or licensing any software that competes with Microsoft Platform Software or any product or service that distributes or promotes any Non-Microsoft Middleware;
shipping a Personal Computer that (a) includes both a Windows Operating System Product and a non-Microsoft Operating System, or (b) will boot with more than one Operating System; or
exercising any of the options or alternatives provided for under this Final Judgment.
>Besides, when all PC makers depend so much on Microsoft, there are many other creative ways to punish OEMs who want to be more independent without raising too many eyebrows
If an OEM can prove it, billions of dollars to them in court.
Maybe System76 can be the big Linux OEM, except that Linux folks buy Thinkpads or Macs.
Do you have any documentation that MS is suing them?