
Canonical, Ubuntu, and Why I Seem so Upset About Them All the Time - ihuman
http://mjg59.dreamwidth.org/39913.html
======
SeanDav
I am only a casual user of Linux and have no axe to grind or sacred cows to
protect. From my point of view, one of the biggest issues with Linux is
fragmentation: which of the 2000 flavours of Linux do I choose for my home
use, and why? My perception of Canonical/Ubuntu, as an outsider, is that they
have come along and tried to make a one-stop shop for Linux for the masses.
This is a good thing. If you are an expert you can still use one of the other
flavours, or even fork Ubuntu. They are not making it impossible to create
"yet another Linux flavour", but they are not making it easy. From my point
of view this seems to be a plus and not a minus.

~~~
infinity0
I don't think you properly understood the article or FOSS in general. It is
not about protecting a "sacred cow", it is about protecting the principles
without which there will be no FOSS in the first place.

We FOSS devs rightly care less about market share because market share is not,
and never has been, what sustains the FOSS ecosystem. What sustains it is the
code itself, which attracts developers from all over the globe to look at it,
learn from it, and be interested and inspired by it, and the principles of
freedom that enable them to contribute to it and sustain this cycle.

(edit: downvotes are not for comments you disagree with. That's what responses
are for. Stop abusing the system selfishly.)

~~~
privong
> (edit: downvotes are not for comments you disagree with. That's what
> responses are for. Stop abusing the system selfishly.)

I didn't downvote your post, but your statement about downvoting isn't true.
It's certainly not in the guidelines[0]. And if one needs an appeal to the HN
operators, as others have pointed out on separate occasions where your claim
has been made, pg has previously said downvoting for disagreement is fine[1]:

 _I think it's ok to use the up and down arrows to express agreement.
Obviously the uparrows aren't only for applauding politeness, so it seems
reasonable that the downarrows aren't only for booing rudeness._

[0]
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

[1]
[https://news.ycombinator.com/item?id=117171](https://news.ycombinator.com/item?id=117171)

~~~
infinity0
I don't know what the effect was historically, but right now a comment gets
greyed out if it receives too many downvotes. Furthermore, you have to have
several hundred karma to be able to downvote in the first place. In other
words, the behaviour isn't symmetrical with upvoting, so I find it very
uncomfortable, even for myself, to downvote merely for disagreement (i.e. the
opposite of an upvote).

~~~
JoeAltmaier
Same here. I reserve downvoting for unhelpful comments.

One litmus test for an unhelpful comment is whether the word 'you' is used. I
try to keep it on topic, and the topic is _never_ the other commenters.

~~~
smcl
There are gonna be tiny exceptions, but this is an excellent guideline for
comments, and one I will try to follow in future.

------
bryanlarsen
I find criticism of Canonical interesting. They are one of the most freedom
respecting software companies out there. But people don't compare them to
almost any company, they compare them against Red Hat. It seems that Ubuntu
gets criticized not so much because they are really bad but because Red Hat is
so awesome.

~~~
valarauca1
You raise a solid point. The Red Hat/Canonical comparison paints Canonical in
a bad light. Yet the Apple/Canonical comparison, on the other hand...

~~~
cwyers
Except Apple isn't trying to subvert the intention of the GPL, it's just using
code not covered by the GPL.

~~~
vacri
Torvalds explicitly said he doesn't like the GPLv3 because the GNU people
tried to subvert the intention of the GPL and lied to people about what it
meant. The GNU people are the same ones the main article refers to when
looking for a definition of 'Free Software'.

So we find ourselves in a situation where the article author is apparently
arguing against group X because they're trying to subvert the intention of the
GPL, and using as evidence a definition provided by group Y, who tried to
subvert the intention of the GPL.

~~~
chimeracoder
It's kind of hard to make the argument that the FSF tried to subvert the
intention of the GPL, since they wrote _both_ the GPL itself _and_ the four
freedoms (the philosophical principles on which the GPL is based).

You can make it, yes, but it's not a very convincing one.

~~~
vacri
Well, have a crack at it, then. Here is Torvalds talking about how he sees the
point of the earlier GPL being subverted by v3, and that he thinks that the
FSF lied to people about it. In his opinion, it should have been named a
different license. He goes into detail about how it's not the _license_ he
hates, but how the FSF behaved around it.

[https://www.youtube.com/watch?v=PaKIZ7gJlRU](https://www.youtube.com/watch?v=PaKIZ7gJlRU)

> _since they wrote both the GPL itself and the four freedoms_

Also, having been part of organisations that have operated against their own
philosophical principles and mission statements, I don't accept that the FSF
is immune to changing their angle on things. And having hung around plenty of
radical progressives growing up, I've seen some handy re-rationalisations of
earlier positions.

Overall, I'm on the side of the FSF. But fucked if I'm going to be part of
this thing that progressives do, which is piss all over their allies (in this
case, ubuntu) just because their opinion is _slightly_ different to one's own.
If someone's going to raise the "nuh-uh, subversion of license = evil", then
fuck it, here is a video of someone at the heart of the whole licensing debate
literally saying that the FSF behaved immorally about licenses too.

~~~
Steltek
But isn't it healthy to re-examine your core beliefs and make tweaks or even
whole substitutions? I don't see anything in GPLv3 that directly contradicts
the original four freedoms. It was evolutionary, closing loopholes like
Tivoization.

But you raise a second point: how you treat your allies and other sympathetic
parties. Sadly, idealism seems to trend towards that behaviour, and it's
pretty counterproductive.

~~~
vacri
I absolutely agree that you should be allowed to change your opinion over
time, as long as you do it 'honestly'. For example, it bugs me when people
lambaste politicians for having a different opinion now as opposed to 20 years
ago. But at the same time, by 'honest' I mean being open about the changes and
not quietly twisting them or hiding them, which is what Torvalds is
complaining about in that video. In the organisations I mentioned being part
of, the same thing applied: lip service was paid to the (sometimes stale)
principles, but the behaviour was rationalised very baroquely to fit them.

Re: the idealism stuff, my armchair psychology take on it is this:
conservatives are passionate about things not changing, so minor differences
don't matter so much because the overall goal is the same; progressives are
passionate about things changing, so if the cart is going to move at all, it
really needs to go in _my_ direction. Add in the human predilection for not
seeing the forest for the trees, and the progressives will squabble over what
really are minor differences.

------
reitanqild
My bigger problem with Ubuntu was that they went from creating a really nice
and easy-to-use distro that I wanted to throw money at to breaking everything
just to be like Apple. (Don't get me wrong on this: I'm an Apple fan. I just
find their UX hard to use, and think people should copy the rest of what they
do, not the menus etc.)

------
nwah1
I notice that there's an inherent conflict between the democratic approach
found at Debian, Mozilla, the W3C, etc. and the dictatorial approach of Apple
and so on.

In a way, it mirrors the difference between free markets and command
economies. Free markets tend to be resilient, offer individuals the most
power, and tend to favor those with skill at the expense of lots of market
fragmentation and inefficiency. Command economies, by contrast, can have
everything working together towards one common goal, and allow for every
individual to have some measure of equality and security. They can operate
with either far more efficiency than a free market, or far less... and the
fortunes can rapidly change.

Democracies respond by then copying the innovation of the risky
dictatorial/command experiments.

Ubuntu is trying to do a little of both. It tries to be like Apple, and thus
doesn't try to conform to the general FOSS community initiatives if it thinks
they aren't innovating correctly or quickly enough. That is why they created
Unity, Mir, and many other in-house projects. Tightly integrating branding
into the products is just par for the course.

By not trying to go along with the program, they annoy everyone else. But
since the Debian Technical Committee, X Foundation, Linux Foundation, and W3C
committees are not like the Comintern or the Apple board of directors, they
have no power to enforce that program. Hence the fragmentation.

~~~
sbuttgereit
An aside regarding the details of your analogy: I have to contest the claim
that command economies can be more efficient than free markets. Please note
that this does not mean that free markets are perfectly efficient: they
aren't. But even if we dismiss the outsized impact that a leader's
personality can have on the functioning of a command economy, command
economies cannot be as efficient as a free market.

We need to define what it means to be efficient: I use efficiency to mean how
well market producers meet the demands of market consumers. By that
definition, a central planning authority can neither obtain enough actionable
data from the market nor act on it in a timely enough way. Individual
consumers have differing wants and needs: differing to the point that you
would be hard pressed to define what constitutes a want vs. a need in all
cases. Even the best central planners would struggle with such complexity.
You may be able to use prior consumption numbers to plan the next cycle, but
things change. Also, adoption of new technology (not just computer
technology) is difficult to predict. When the car is invented, does the
command economy order a batch to see how it plays out? Does it order none,
because no prior demand was expressed? Can the car even be invented or
manufactured, when the likelihood of it even being considered is so small as
to make it not worth pursuing, or when enabling technologies (say, better
lathes) were never developed, because there was no present purpose for them
and no reward to the inventor for pursuing such a path? How does one set
prices, or does one dictate consumption as well? Fail at pricing and you have
shortages of one thing and over-abundances of another. This goes on and on.
And again, these questions arise even setting aside graft and ill-will from
the central authorities.

Free markets, on the other hand, distribute penalties and rewards based
purely on how well a producer meets a consumer's requirements (note that this
is different from producing a perfect world for the consumer... consumers
have a limit of what is acceptable, so you may buy something you wish were
much better/cheaper/etc. than it was, but you still concluded that you'd be
worse off keeping the money rather than having the product, so you traded...
a need was met). An individual producer can focus on an individual area of
need for specific consumers much better than a central authority can; and if
the existing producers fail some consumers, an opportunity exists for other
producers to enter the marketplace. If a producer meets consumer demand in a
way that makes good use of the resources that production itself consumes,
they have the resources to continue to serve the market; if they fail to meet
consumer demand in a resource-efficient manner, they cease to be in the
market.

Profit is the measure of efficiency in meeting a consumer demand: if you have
bad profits, you're not efficient in some way; if you have good profits, you
are. Rewards and "punishments" from the market itself coordinate resources,
and do so quickly and without needing to wait for the next five-year plan. In
command economies, if consumer needs aren't met, rarely are there any bad
outcomes for the producers (the state). You can't choose to buy from the
other guy, because there is no other guy. And to reward good outcomes? Why
bother, when you control the rewards and the penalties either way?

Anyway, to bring it all home, fragmentation in the Linux world is really about
this random walk of producers trying to meet needs: the interesting question
is, "whose needs?" Ubuntu is shooting for customers: those not interested in
building the operating systems/tools and those that just want to use them. Red
Hat is similar except for a narrower set of consumers. But, say, Debian (to
some degree), or even better Slackware (a personal old favorite)... who is the
consumer? I would argue that oftentimes it is the developer/maintainer who
is actually the consumer: they consume their leisure time and make other
consumer expenses because of their desire to engage in the activity of
developing or maintaining a distribution; OK, maybe less so Debian than
Slackware, but I think the point still applies. Many of these distributions
don't exist because an entrepreneur was trying to reach an underserved
market... they exist because the developer/maintainer set out to achieve
personal goals. As such, a distribution cannot fail on adoption rates or by
monetary success measures, since the reward/punishment is really a matter
of developer satisfaction in the pursuit. That's also not to say that these
personal pursuits cannot also be commercial pursuits or that developers don't
want to see their work adopted by others, but the primary motivation will not
weed out less successful distributions because they are actually successful
for what the makers tried to accomplish. And in that sense, they are meeting a
need and that is actually efficient... just not in the way you may desire or
expect.

~~~
nwah1
Command economies can make risky bets on particular technologies or
strategies, and just reorganize everyone's life around this goal. It is very
authoritarian, often tyrannical, but the notion that it is necessarily
inefficient is just stupid.

Planning occurs in markets too. A Boeing jet takes decades to bring to
market, and the organization must make risky long-term plans and bets. They
may pan out, or may not. But every entrepreneur and startup at YC is doing the
same sort of thing. Taking losses for years, even, because of a belief in
their plan.

Sometimes, risky bets pay out. Sometimes spectacularly. And sometimes they
crash and burn. The resilience of a market economy means that not everyone has
to be on board for such a venture. Indeed, the market is a collection of
organizations that are all either slowly or quickly in the process of failing.
None of them will last forever, but they generally won't all fail at the same
time. A command economy can, and often does.

------
PhasmaFelis
I miss when Ubuntu was the unqualified choice if you wanted to recommend a
Linux distro to a non-techie.

~~~
dineshp2
It still pretty much is, along with Linux Mint.

The average Joe does not care much about open-source philosophy. If it's free
and works well out of the box without having to touch the command line,
they're happy. This is where I think Linux Mint wins, even though it includes
many proprietary codecs.

~~~
StavrosK
I'm an Ubuntu user and I want to give Mint a spin (because my parents and
acquaintances are a bit afraid of the "you have to upgrade your whole OS"
prompt). Which desktop should I pick for a casual user?

~~~
dineshp2
Cinnamon would be a safe bet.

It looks and works like Windows 7, and most people would feel comfortable
using it.

For older computer hardware, choose the lightweight MATE edition of Linux
Mint.

~~~
StavrosK
Great, thank you!

------
tajen
Many groups have been reusing Ubuntu recently. I'm thinking of Linux Mint and
Elementary.io.

One day, a new group will be more focussed on graphics, will hire UX
designers, and will publish a polished, paid version of Linux. As in: you can
redistribute it, but if you want to subscribe to the repository that contains
the latest bugfixes and the awesome UI design upgrades, you'll have to pay;
and this is perfectly legal with free software, even the GPL. We've only
avoided that until now because so many groups have a huge disdain for making
money off products (which is an awesome value, but just a value).

So it is understandable that Ubuntu tries to raise barriers to entry.

~~~
malka
The purpose of the GPL is not to prevent people from earning money. It is to
ensure that you can always modify your software yourself, or seek a better
alternative because the standards are open.

~~~
LeonidasXIV
> you can seek a better alternative because the standards are open.

That's not really true; the GPL says nothing about standards. You can create
your own proprietary standard with your code covered under the GPL. Funnily
enough, you can have GPL software and still create a minefield of confusion,
as Oracle does with the OpenJDK, which is GPL.

I fully agree on the first point, though. Sadly, many people forget that.

------
legulere
I somehow have the feeling that the longer software licenses get, the more
licensing-related drama happens.

~~~
jordigh
There was a lot of drama between BSD and AT&T even though the licenses were
short. There is also drama over MPEG formats due to patents, again despite
short licenses, which led to WebM.

A short license does not mean that it will have fewer problems. It just means
it addresses fewer concerns.

At any rate, mjg59 is not complaining about software licenses, but about
trademark policies. Trademarks are a very different body of law from the
copyrights that software licenses typically handle.

------
gcb0
so, just use Debian.

------
pekk
If you have the source, it shouldn't be difficult to build binaries, should
it?

If you are building a new Linux distribution based on Ubuntu, I don't even
understand why you would want to reuse binaries they built.

This post would be more meaningful if it literally quoted from unfavorable
license terms used by Ubuntu to actually prevent people from making
derivatives. Saying that Canonical "appears to require" something is using
weasel words.

~~~
mjg59
> If you have the source, it shouldn't be difficult to build binaries, should
> it?

You'd think, but no - there's no guarantee that the shipped binary packages
can be built with the shipped toolchain or shipped dependencies.

> If you are building a new Linux distribution based on Ubuntu, I don't even
> understand why you would want to reuse binaries they built.

Because it's entirely unnecessary? Building the entire archive is a huge
amount of effort.

> This post would be more meaningful if it literally quoted from unfavorable
> license terms used by Ubuntu to actually prevent people from making
> derivatives. Saying that Canonical "appears to require" something is using
> weasel words.

The terms are at
[http://www.ubuntu.com/legal/terms-and-policies/intellectual-property-policy](http://www.ubuntu.com/legal/terms-and-policies/intellectual-property-policy)
- I probably should have linked them; I've just been writing about this
enough lately that it's easy to forget that people might read a single post
without context.

~~~
lmm
> You'd think, but no - there's no guarantee that the shipped binary packages
> can be built with the shipped toolchain or shipped dependencies.

If the shipped source doesn't contain enough information to actually perform
a build, surely that's a straight-up GPL violation (and a far more serious one
than any possible ZFS issue, and one that you're in a position to do something
about as a copyright holder). If the shipped source doesn't include whatever
their developers actually use to build (and I refuse to believe an
organization like Ubuntu wouldn't have a unified "build it all" script, or at
least a README that described the steps you needed to take) then how can it
possibly be the preferred form for making modifications?

~~~
mjg59
> If the shipped source doesn't contain enough information to actually perform
> a build surely that's a straight-up GPL violation

Package A's source code contains some undefined behaviour that gcc 4.4
tolerates. The distribution upgrades to gcc 5.1 and the build now breaks, but
nobody notices because no new version of Package A has been uploaded and so no
new build has been performed. Is your position that it was in compliance with
the GPL before the gcc transition, and in violation afterwards?

~~~
lmm
No. If the Package A developers just use whatever version of gcc is installed
when working on the code, and don't have any process, then they're compliant
(but dumb). But if they have a script, or even just a "building.txt" that says
"install gcc 4.4 and set these environment variables, build is broken on 5.x,
known issue", and they exclude that from their source distribution then
they're in violation. They need to ship the preferred form for making
modifications, i.e. what they would use themselves or give to a new developer
on their team. It's no different from the example of C source generated by a
perl script (you can't ship the generated source, you have to ship the
original script), and it's exactly what's necessary for the four freedoms -
end users need the same facilities that the original developers have when
working on the code, not a watered-down version.

~~~
mjg59
> But if they have a script, or even just a "building.txt" that says "install
> gcc 4.4 and set these environment variables, build is broken on 5.x, known
> issue", and they exclude that from their source distribution then they're in
> violation.

They don't.

~~~
lmm
Really? They being Canonical? They just build their binaries with whatever
happens to be installed on Bob's machine this week and then that becomes their
official release?

~~~
mjg59
They build their packages with whatever is in the distribution at the time,
yes. It happens on the build daemons rather than on the developer machine.

~~~
lmm
Then how can you get into the situation that the shipped binary packages can't
be built with the shipped toolchain or shipped dependencies - isn't that what
the build daemons do, use the system to build itself? I mean if they use the
previous release or something that's fine too. Whichever way they do it,
surely it's documented internally, because it's the kind of thing developers
would need to know when developing. I can't believe it would be a case of "if
the build doesn't build, ssh into the build daemons and update them until it
does".

I mean a developer working at Canonical, trying to make a change to one of the
packages they're preparing for distribution, has to have some way to answer
the question "what version of gcc is being used to build this package".
Whether that information is kept in the source tree, on their wiki, or on a
post-it on the employee fridge - in any case, it's part of the source in the
preferred form for making modifications, because it's, well, part of what the
employees use to make modifications.

~~~
mjg59
> Then how can you get into the situation that the shipped binary packages
> can't be built with the shipped toolchain or shipped dependencies

The distribution isn't rebuilt every cycle. If no new source release has been
uploaded, the existing binary will be used for the next release.

~~~
lmm
So if they need to apply a small patch to a given package they'll sometimes
discover that it was built 3 versions ago and no longer compiles? Yuck.

