
A Generation Lost in the Bazaar  - mahmud
http://queue.acm.org/detail.cfm?id=2349257&ref=fullrss
======
phkamp
OK, I'll make a couple of general observations here.

First: It would be a big help for this discussion, if we could have the
informal convention that people who were employed in an IT job before 1990
marked their post. I think it would show a quite clear divergence of attitude.

Second: It's very obvious that a lot of you have never been anywhere near
the kind of software project posited in the "cathedral" meme; instead you
project into the word whatever you have heard or feel or particularly hate.
That's not very helpful, given that there is an entire book defining the
concept (I believe it's available online on ESR's homepage, how about you read
it ?)

Third: No, I'm not of the "either you are with us, or you are against us"
persuasion. The bazaar is here to stay, but having everybody in need of
transportation buy the necessary spare parts to build a car is insane.

Fourth: Related to point two really: A lot of you seem to have little actual
ambition of making things better, I guess that is what happens if you grow up
in a bazaar and never even experience a cathedral. I pity you.

~~~
jballanc
First: Have you ever looked at a plot of people for/against marriage rights
for gays vs age? I suspect that the clear divergence does exist, but may not
mean what you think it means...

Second: I worked on OS X. In fact, I worked on Snow Leopard, and the start of
Lion. What's interesting about that is that Snow Leopard was the last version
of OS X developed according to the "Cathedral" model. Also, while the Snow
Leopard cathedral was being built, iOS was being developed firmly using the
bazaar model...

Third: You know, I wonder if you've ever _been_ to a bazaar before? I live in
Turkey, where the bazaar is a way of life (and where one of the largest,
oldest bazaars in the world is located). I've never found "spare parts" at a
bazaar. What I have found is some of the highest quality jewelry, tapestries,
rugs, and other hand-made goods you'll find anywhere.

Fourth: I think you're conflating "good quality"=>Cathedral and "poor
quality"=>Bazaar. The only thing that distinguishes the Cathedral and the
Bazaar is whether or not there is one single individual in whose head the only
valid vision of the completed project exists. You might do well to read up a
bit on the history of Kapalıçarşı. Throughout its history there were guilds to
enforce authenticity and all manner of quality control mechanisms. It is
possible to have a Bazaar _and_ a very high quality product.

~~~
thebooktocome
According to phkamp, iOS is a cathedral kind of project.

I was hoping someone would bring up OS X and how horrible its cathedral-born
API became around 10.5, but I was born in '85 and therefore have no right to
speak in this thread.

~~~
jballanc
While Apple would certainly like you to believe that the image of iOS sprang,
fully-formed, from the mind of Steve Jobs like some sort of medieval
cathedral, the reality is anything but. In fact, much of Apple's success is
due to the extent to which it functions like a confederation of very well
funded startups. Cupertino is very much a bazaar, wherein a shopper with very
refined tastes (Jobs, for example) can pick and choose the finest wares.

This article concludes that quality only happens if someone takes
responsibility for it. Yes. You'll find no argument on that point from me. But
cathedrals are not _only_ about one person taking responsibility for quality.
They are about _a priori_ design.

Quality control can happen _ex post facto_ but creativity, once strangled,
dies.

~~~
ori_b
I'm going to drop the cathedral and bazaar analogy, because I don't feel like
debating the exact meaning of the terms.

It seems to me that quality is about having someone on the top who is willing
to give direction and vision to a project, instead of having every person who
scratches their own itch full influence. Contributors are great. But letting
everyone pull in their own direction doesn't lead to something that feels well
engineered.

PHK seems to be calling this vision and cohesive design (a-priori or not) the
"cathedral model". I agree strongly that for a project to work well and feel
"high quality", it needs some source of a unified idiom.

~~~
jballanc
The problem is that coherent vision and _a priori_ design are NOT equivalent.
Obviously, I cannot know what was going through the head of ESR when he wrote
the original essay, but what I've always taken from it is that _a priori_
design is inherently inflexible, prone to becoming disconnected from reality,
and ultimately less inviting to creativity.

If the suggestion of the article is that the only way to retain quality is to
move back to the "single vision" world of _a priori_ design, I'm sorry...that
ship has sailed, the cat is out of the bag...whatever your favorite analogy,
PHK is very right that the new generation has gotten used to _not_ having
instructions handed to them.

Of course, anyone is free to start a project with a single vision, recruit new
members, and do their best to grow the project. I suspect, however, that such
an effort would lose out to one that figures out how to develop a coherent
vision without the need for _a priori_ design.

~~~
phkamp
Brooks spends some time discussing these issues in the book, and I'm pretty
much aligned with him:

Nobody believes in "a priori design" and I somehow doubt that anybody ever did.
Brooks points out that the original publication of the "waterfall model" was
meant as "how not to..." and people got that wrong.

But cathedrals are not about a priori design; they are about style, elegance,
economy of means and coherency of design.

But my point in the piece is that the lost generation doesn't even know what a
cathedral is in the first place, having grown up in the bazaar.

~~~
jballanc
But did they grow up in the Bazaar? or in the Mad House?

ESR's original prototype for the "Bazaar" was Linux. Do you feel that Linux is
lacking in coherency of design? You refer to the dot-com bubble, but the
problem with the bubble wasn't, I think, that it was the "Bazaar". Indeed, I
don't think the dot-com bubble of the late 90s was characterized by much open
source development at all!

It was, rather, consumed with "flashiness" and "wow factor". I definitely see
the continued obsession with these things as a problem. I would say that, for
example, much of the obsession with Node.js today is a consequence of this
obsession. But that isn't a Bazaar.

It's a disco.

------
jng
As someone who grew up as a programmer with assembly language, C, and C++,
learning the good practices needed to make a 1-million-line C++ application
work and be maintainable:

I've been equally disgusted by the evolution of programming in the last few
years. HTML - which is, to a first approximation, always invalid. JS - which
needs jQuery to make it mostly-but-not-completely cross-compatible among
browsers. CSS, which hides so many hacks on top of each other, and which even
needs a reset to be compatible. And now - distributed systems with pieces in
PHP, Python, Objective C, Dalvik-Java, etc... and sustained by awesome-but-
hackish fixes like Varnish and FastCGI and Nginx, where everything seems to be
put together with duct tape.

But I'm slowly getting to the next stage, and having to admit - no, actually,
admitting - that if it has spread like wildfire, there must be something to
it, no matter how much it hurts our taste.

And if you look closely, it's very similar to biological evolution. Our
own genetic codes and body plans are full of ancient pieces that are not
needed any more (or at all - male nipples, anyone?), but it seems that it was
more economical to "patch" things than to fix things properly. Or at least, it
didn't hinder the current designs enough that they wouldn't succeed over the
alternatives evolution probably also tried.

And now Google translates text statistically, without even trying to
understand things.

One fear I have is that we will be able to create a working AI in a few
years... and due to the way we do it, we may even not understand how it works.

~~~
lerouxb
Nitpick: jQuery exists to make the HTML DOM manageable, not JavaScript the
language. (There are libraries targeted at JavaScript the language, but you
could make an argument that that's the point of all libraries for all
languages...)

I find jQuery more analogous to tools like configure and autoconf, in that
they aren't actually needed for "modern", standards-compliant browsers. As an
example: there are already many lightweight drop-in replacements that assume a
sane browser to begin with.

CSS is remarkably hack-free, and the resets are just there to remove the
default styling that browsers have built in. Obviously browsers need built-in
styles, otherwise all the old pre-CSS pages would stop working.

And why are you calling programming languages and web servers hackish? How is
Nginx hackish and Apache not? How is Objective C hackish and C++ not? Why stop
inventing new languages at assembly, C, or C++?

Other than that I generally agree with the rest of your comment. I think if
we ever get to some kind of technological singularity we'll almost certainly,
and just about by definition, not understand how it works. And true/strong AI
would definitely be a singularity. Even if we understood version 1 we would
likely not be able to understand whatever it dreams up 5 seconds after we
overclock it. And we will - Moore's law and all that. ;)

~~~
jng
I don't mean Nginx or Varnish are more hackish themselves than Apache. I could
have said Apache instead of Nginx, but I think...

Back in the day, a "serious" application usually involved a large amount of
quite homogeneous source code in a single language (or 2 or 3 different
languages for different tasks: C/C++ for the main code/engine/logic, assembly
for performance-sensitive code, and maybe some custom high-level script for
high-level application logic). This was built usually on a single machine with
a single build script, and resulted in some binary which could be deployed.

Even early web apps were more akin to this model - Java apps, or even Perl
apps.

Nowadays, an "app" lives distributed among dozens of servers and client types.
People describe what they have: "12 memcaches, 5 varnishes, 8 nginxes, 5 app
servers with Ruby, 4 static content servers, and a MySQL master server with a
fallback master and 5 slaves for reads. Separately, an Android app, an iPhone
app and an HTML5 front-end."

These systems started as a single app in a single box, and have "grown" for
scalability, reliability and security, being patched up with different pieces
of technology put together with often unreliable methods, but with fallback
mechanisms that make them more resilient than if reliable methods had been
used with no fallback. It's not so important nowadays that the application
code be clean, elegant, or failproof, but that measures are put in place so
that the service will keep chugging along most of the time.

And setting up the whole system from scratch involves a manual with probably
hundreds of steps putting together haphazard technologies, installing
different types of Linuxes and packages for each piece. And may even involve
difficult-to-replicate steps, such as using an AMI to launch EC2 instances,
where the engineer that created the original AMI doesn't even work here any
more and recreating it from scratch would involve 20 packages, Googling for 5
of them as they're not available in normal repositories, 5 manual patches, two
secret incantations and a tribal dance around the chair while singing and
praying to arcane gods of long-forgotten package managers.

{{Side-note: although it's not the main point, I'd definitely say Objective C
is more hackish than C++. C++ has its own amount of weird stuff due to being
built on top of C and its evolution with templates, etc... but the evolution
of Objective C is done much more in the form of "patches". Even the original
syntax takes advantage of awkward gaps in the syntax of C (@interface,
@implementation, #import, [object msg:param]?). But it's even worse how new
features are piled on top of old ones: properties, ARC 2.0, etc... it's
distinctly noticeable when you just wouldn't feel comfortable teaching
programmers new to Objective C how to use ARC, without learning the underlying
memory management model first. In C++, people can learn "new" and "delete",
and never hear about malloc()/free(), and there's no problem at all.

Anyway, my point about hackishness didn't refer to ObjC, but to how whole
systems are engineered nowadays.}}

~~~
lerouxb
You're basically describing the very nature of distributed systems :) We see
many more of them nowadays because of the scale of some modern web apps /
sites whereas in the 1990s the internet just wasn't that big or that
complicated. Whether every app that ends up being that complicated actually
needs to be like that is a different question.

It also comes back to "don't reinvent the wheel" and "not invented here
syndrome". You _could_ implement many of these things in your homogeneous
codebase, or you could just reuse something that already works. I'm just
saying there are serious and good counterarguments to the "large amount of
quite homogeneous source code written in a single language" approach.

I agree with some of the details (like AMIs, EC2, different types and versions
of Linux, different package managers in the same system), but I think in
general the things you think of as hacks aren't hacks at all, just the signs
of progress. Or at least the nature of large, complex modern distributed
systems. You basically described the term "agile development", and you're
remembering a time many developers would rather forget with rather rose-tinted
glasses.

(I don't really know enough about objective-c to comment.)

------
chwahoo
This article reads like an old timer feeling left behind by the current rate
of progress who thinks that the problem is really that the rest of the world
is doing it all wrong.

He's probably right that lots of software could be designed better, but I
think he's wrong that that's of paramount importance. We need lots of software
these days and we simply don't have the resources to build it to Kamp's
standards. Also, we've learned that our requirements change so fast that his
beautiful design would quickly be twisted into the pile of hacks that he
hates.

He notes how long it takes to compile the software that runs on his work
machine---how long would it take to compile the software on a Windows machine?
(Or whatever he would claim is the standard-bearer of his cause.)

He also attacks open-source software. The truth is, we have an amazing amount
of free and open source software available. Some of it may be flawed in design
or usability, but it enables us to solve so many problems (and look at/modify
the software when needed). I don't think that all software needs to be "free",
but I do think free software has made us all much richer. I have a hard time
seeing all this value that the Bazaar has created as inferior to slow-moving,
centralized, big, up-front design development.

~~~
phkamp
I wonder if you bothered to check who wrote that article, before you started
speculating about things you could have found out with a few google searches ?

You seem to assume that cathedrals are "slow-moving, centralized, big up-
front" designs, where did you get that idea ?

Ever looked into how the USA put a man on the moon ?

Maybe you should. Also: read Brooks' book, if you can.

~~~
chwahoo
Why does it matter who you are? I've tried to respond to what you wrote.

Why do you think most of the software we build now is like putting a man on
the moon? I think it's nothing like that.

Maybe I should read Brooks' book (I've enjoyed MMM), but your article doesn't
make a strong case for it.

~~~
DanBC
> Why does it matter who you are? I've tried to respond to what you wrote.

Because you claim he's attacking open source software, and you said that maybe
he thinks MS is the standard bearer of his cause.

The (easy to find) facts are that he has committed a lot of code to open
source and has been involved in FreeBSD for many years. These are not small,
trivial bits of code used by a few people or for insignificant reasons; crypt
and Varnish are bits of code getting billions of uses.

~~~
chwahoo
If the author's open-source credentials are part of his argument, then he
needs to make them part of the story he tells in his article. He shouldn't
assume that people will connect the dots (he doesn't have the name recognition
of a Torvalds or Stallman). Further, even with the context, I don't think the
dots add up to enough of a picture to understand what he's arguing
for/against.

To make a convincing argument, he should have made it clearer what he was
arguing against ("the bazaar" is too nebulous and refers to many things, .com
development refers to other things) and what he was arguing for (which was
entirely missing, other than a reference to Brooks' book). If he doesn't have
a clear vision of how development should work (or a "standard-bearer"), he
should have at least provided some examples of things working the "right" way
(or better).

He should probably also have left off a bunch of the insults and hyperbole:
"clueless" / "hacks" / .com period a "disaster" for code quality. I'm frankly
pretty surprised that ACM Queue would publish this.

~~~
Munksgaard
In addition to what chrisaycock said, phk's writings are typically very
"lacking" and assume a certain knowledge about whatever he has chosen to
write about. He writes interesting stuff, but a lot is implied and/or not
sufficiently explained.

~~~
phkamp
Or maybe, just maybe I don't write for the lowest common denominator, but for
people who can think for themselves.

~~~
mcguire
He's also a little bit fond of personal attacks.

~~~
DanBC
A bizarre feature of some of the developer mailing lists and Usenet groups is
the absolutely hateful vicious toxic nature of them.

OP's site bikeshed.org (<http://bikeshed.org/>) has some nice proposed
features for software that posts to large audiences.

    
    
          +------------------------------------------------------------+
          | Your email is about to be sent to several hundred thousand |
          | people, who will have to spend at least 10 seconds reading |
          | it before they can decide if it is interesting.  At least  |
          | two man-weeks will be spent reading your email.  Many of   |
          | the recipients will have to pay to download your email.    |
          |                                                            |
          | Are you absolutely sure that your email is of sufficient   |
          | importance to bother all these people ?                    |
          |                                                            |
          |                  [YES]  [REVISE]  [CANCEL]                 |
          +------------------------------------------------------------+

          +------------------------------------------------------------+
          | Warning:  You have not read all emails in this thread yet. |
          | Somebody else may already have said what you are about to  |
          | say in your reply.  Please read the entire thread before   |
          | replying to any email in it.                               |
          |                                                            |
          |                         [CANCEL]                           |
          +------------------------------------------------------------+
    

Perhaps any forum software needs these, as well as buttons saying [really post
this?] [save this angry version locally, and give you time to write a calmer
version] etc.

~~~
dalke
I remember that my newsreader back in the early 1990s did exactly this. I had
trouble posting the first few times because I took the message to heart and
didn't want to waste other people's time. But I quickly realized that other
people didn't follow the same suggestion, which meant that I should ignore the
message, and even interpret it as wrong.

Why is it wrong? For example, consider the second message. Suppose you haven't
read all the messages in a thread because you've been on holiday for the
previous few days, but a friend pointed you to a message in the thread which
specifically mentions your name, asks "could you verify this for me?", and
doesn't have any followups.

Why should your newsreader force you to read all of the thread in order to
answer something which isn't in the rest of the thread? Yet the only option
there is "cancel".

Furthermore, "at least 10 seconds" is completely wrong. People killthread,
plonk people, and develop other ways to ignore discussions. Assuming 4 hours
of reading per day at those 10 seconds each means people can handle at most
1440 messages. There are 276 comments already in this thread, but I expect
most people sampled a few messages on each major branch, skimmed a bit, and
perhaps did a text search to see if someone mentioned a key word. Most
assuredly, the entire HN readership did not read this entire thread.

------
calinet6
Architects don't build cathedrals anymore, either. Neither do most software
architects. Yet, look what they have built: an incredibly diverse collection
of structures of all shapes and sizes, all working together to form an
imperfect yet efficient system. Surely their success is due in part to their
flexibility and _imperfection,_ and the fact that they are no longer over-
engineered and inflexible behemoths made from stone.

If you try too hard to design a rigid structure around anything, it can come
back and bite you in the ass. Part of the reason why this seemingly
disorganized and haphazard collection of buildings makes a working and
thriving metropolis is because of its diversity and resilience to change and
progress. If you attempt to over-engineer something of such complexity, you
might just end up with Pyongyang instead of New York. Instead, we have a
competing ecosystem of libraries and options with the good ones theoretically
rising to the top. Better communities and communication of these values make
this process work even better, just as in the greater economy. For all its
complexity, this process works surprisingly well.

For all the explaining Kamp did in this article, the one thing he failed to
account for was the resounding success of the current model. He did use a
surprisingly apt metaphor, however. Cathedrals are no more; their old place as
the center of society had to be explained by a series of myths and lies that
placed ideals above reality. The thriving Bazaar easily replaced them at the
center of any thriving modern city, based on the simple truth that the dynamic
edge of reality was ever-changing, and that quality based on that reality
would indeed be more successful.

Could it be better? Of course. But it is reality and truth that will move us
forward—not mythology. Good riddance to cathedrals.

~~~
phkamp
So you don't count the iOS and Android OSes as cathedrals ?

Have you never wondered why their surprisingly coherent APIs are so different
from, say, UNIX's ?

See my other comment about knowing what a cathedral is to begin with.

~~~
calinet6
Fair, but I would describe them more as well-engineered skyscrapers. Or even
simply good city planning. They're large structures designed around the idea
of flexibility and growth from the outside. Ironically, both of these
described platforms have orders of magnitude more functionality in their
'bazaar' app-stores than the foundation itself. The foundation enables the
market to work, but in itself it is simply a very well-designed support
system.

I guess you're saying that there's no reason UNIX should be different—that it
has somehow become a disorganized mess of libraries and code re-use, and that
it is broken.

I don't see this. Android and iOS are like brand new cities; Dubai and Abu
Dhabi. UNIX is Paris. Perhaps you can make some updates to the city
infrastructure, but in the end the streets are still cobblestone underneath,
designed for horse carts and not cars. But you have the Notre Dame at the
center (ironically, a cathedral), the architecture, tons of history, you have
a thriving culture and countless people working and playing in this city every
single day with great success. You can complain about it all you want, but
there is a reason that people are fundamentally attracted to it.

I think you're right about a lot of things, but if you propose rebuilding
Paris, well, you may have some opposition. I think you might want to build a
new city, rather than trying to build over very old history with some ideal of
the cathedral you think it should have been. And to use your examples again,
that strategy has been very successful for Android and iOS.

~~~
phkamp
And now you are starting to see what is meant by "cathedral": Not a place of
religious worship, but a construct with a coherent vision.

Read Brooks' book.

~~~
calinet6
I knew exactly what you meant by "cathedral," I simply extended the metaphor
even further because I thought it was quite strikingly appropriate.

Will do, thanks for the thought-provoking article.

------
bguthrie
This article confused and dismayed me.

Since 2001, we've started in on the XP/Agile movement, which, think what you
will of it, test-drives and version-controls relentlessly, and fosters a
constant dialog about what quality is and how to achieve it.

Furthermore, some marvelous tools have been written in the last decade; I can
hardly see how the author can complain about version-control systems, as Git
and Mercurial--both miles better than what was available in 2001--are not
least among them.

Other quality-enhancing things that have happened since 2001: continuous
integration, build pipelines, Selenium and friends, behavior-driven
development, REST architecture triumphant over the cathedral-like SOAP and
XML-RPC, JSON over XML, lightweight messaging, and the adoption of small,
focused open-source tools over "quality-focused" large vendor bullshit-
enterprise tools.

To the extent that UNIX sucks now, which it doesn't, it's because the hackers
who work on it now have lagged the industry--not the other way around. So go
find some other group of kids to shoo off your lawn.

~~~
phkamp
You mean: "In 2001 we reinvented XP/Agile because we couldn't be bothered to
read the old literature to see if somebody else had done something like that
before" ?

~~~
__alexs
I don't think that's quite fair on Fowler. C&B was published only 2 years
before his book on XP, and The Toyota Way was published in 2001 as well.

Really, many of the ideas which we've come to associate with "Agile"-like
methods only really started to become codified around the end of the 1980s in
more traditional manufacturing. I don't think it's too surprising that it's
taken the software industry another 10 years for this stuff to become
mainstream.

Also from your other comment above...

> Ever looked into how USA put a man on the moon ?

I'm sure FreeBSD would be a lot better if you gave it $25 billion USD over 11
years as well.

It takes more than just good will and intelligent people to produce high
quality results.

~~~
phkamp
I think it's perfectly fair.

UNIX was originally written as pair-programming.

Brooks refers to agile in The Mythical Man-Month (he didn't invent it; check
his references).
~~~
randallsquared
_UNIX was originally written as pair-programming._

From [http://www.drdobbs.com/open-source/interview-with-ken-
thomps...](http://www.drdobbs.com/open-source/interview-with-ken-
thompson/229502480) :

 _KT: I did the first of two or three versions of UNIX all alone._

Later, he emphasizes that they didn't even _look at_ each others' code:

 _DDJ: Was there any concept of looking at each other's code or doing code
reviews?

KT: [Shaking head] We were all pretty good coders._

~~~
gits1225
>>Later, he emphasizes that they didn't even look at each others' code:

I remember reading somewhere (Dennis Ritchie's C history article?) that once
they both programmed a solution without consulting each other, and later when
they read it, the solutions were exactly the same, even down to the variables'
names.

Power of C? Two people who thought alike?

>>KT: [Shaking head] We were all pretty good coders.

Which the modern world doesn't have enough of. Demand is so high that the
quality of the supply doesn't matter, only that there is enough to satiate
the demand; which is in stark contrast to the early days, when computer
science was a research field privy only to scientists and later to hackers
and crackers. When it went mainstream and grew into an industry, it's no
wonder that it acquired all that comes with it.

------
luckydude
I'm another grumpy old man, much like I imagine the OP. I have mixed feelings
about this topic.

I found the autoconf comments amusing; we have supported a pretty broad range
of platforms, from ARM to 390s, IRIX to SCO, as well as Linux, Windows, and
MacOS, and our configure script is 157 lines of shell. Autoconf had its place,
but I think it was much more useful in the past and now it is baggage. The
fact that it is still used (abused?) as much as it is sort of speaks to phk's
points.

I think the jury is still out on whether the bazaar approach is better or not.
It sure is messier but it also seems to adapt faster. I worked at cathedral
places like Sun, and while I continue to emulate their approach I also
question whether that approach is as nimble as the bazaar approach.

I've voted for the older style of more careful development and I think it has
worked pretty well for us, we can support our products in the market place and
support them well. The cost of that is we move more slowly, we tend to have
the "right" answer but it takes us a while to get there. Bazaar approaches
tend to come up with answers of varying quality faster.

It's really not clear to me which way is better. I'd be interested in hearing
from someone / some company that is supporting a lot of picky enterprise
customers with some sort of infrastructure product (database, source
management, maybe bug tracking) and making a success of it with a bazaar
approach. Seems like it would be tough but maybe I'm missing some insight.

------
acdha
The odd part about this “get off my lawn” article is that PHK has already
shown how to fix the problem: Varnish is both a very good tool and one which
has gotten attention and compliments for rejecting obsolete convention (e.g.
by relying on the OS's virtual memory system, by requiring a C compiler to be
installed on the server, etc.). I would love to see autoconf massively
simplified or outright avoided for most projects, and the best way to do that
would be to start providing good examples of how unnecessary it is even for
major projects.

The way to address oversights in the bazaar model isn't to cram everyone back
into the cathedral but to build support for change by showing where something
is clearly better.

One area where PHK might see this is the way Linux has flown past FreeBSD -
not due to endlessly-debated questions of kernel superiority, but rather
because Linux distributions like Debian provided a clearly superior experience
for the overall system by rejecting the decades of accumulated hacks (e.g.
the ports system, monolithic config files, etc.) which had been the status quo
for years, and building better tools to reduce management overhead.

~~~
hollerith
>Linux has flown past FreeBSD . . . because Linux distributions like Debian
provided a clearly superior experience for the overall system by rejecting
the decades of accumulated hacks

Debian contains tons of accumulated hacks that I wish would be rejected.

~~~
acdha
I won't disagree with that entirely but the Debian / Ubuntu community has
shown considerably above average willingness to make major changes to fix
that.

~~~
flatline3
They mostly seem to make a mess while never achieving a design anywhere near
as coherent as that of OS X, or as straightforward as that of the BSDs.

------
wglb
A very thought-provoking article. (Been programming for pay since 1966 here.)

A lot of the commentary seems to be confusing the waterfall model with the
idea of the cathedral.

Another way to make this point is to note how few programmers these days read
Dijkstra, or Knuth's TAOCP. (Knuth would have us believe that even he doesn't
read it. See Coders At Work.) Among other things, Dijkstra taught
understanding the entire program before setting down one line of code.
Contrast this with TDD (which, believe it or not, some take to mean Test
Driven _Design_ ).

Lately I have been in the Application Security business, and nowhere has the
issue highlighted by the article been more obvious.

Edit: Mark Williams Company, inventor of Coherent OS, was not a paint company.
It started out as Mark Williams Chemical Company, manufacturing Dr. Enuff, a
vitamin supplement.

~~~
wonderzombie
So if I wanted to start reading Dijkstra, what's the best place to start?

~~~
wglb
His hand-written notes are here <http://www.cs.utexas.edu/~EWD/>. One biting
example is <http://www.smaldone.com.ar/documentos/ewd/EWD707_pretty.pdf>.

His Discipline of Programming [http://www.amazon.com/Discipline-Programming-
Edsger-W-Dijkst...](http://www.amazon.com/Discipline-Programming-Edsger-W-
Dijkstra/dp/013215871X)

There are some here <http://cs-exhibitions.uni-klu.ac.at/index.php?id=31> that
might overlap with the Texas ones. One famous one is about the cruelty of
teaching computer programming:
<http://www.cs.utexas.edu/users/EWD/ewd10xx/EWD1036.PDF>.

His Wikipedia page has links to a number of his seminal ideas.

------
cs702
Yes, the bazaar is like evolution: messy, inefficient, and slow. Lots of bad
ideas are tried; a lot of them stick around for as long as they provide more
value than they subtract; and progress takes a long time. Just as the human
body has components that are useless today (e.g., the coccyx), evolving
software ecosystems always carry a lot of useless baggage. That's how
evolution works.

But evolution copes better than intelligent, top-down design with the evolving
constraints of a market landscape that is constantly shifting.

~~~
phkamp
... but evolution will also quite happily run you over a cliff that you would
have avoided if intelligence were applied.

Not to mention the fact that the bazaar is never going to put a man on the
moon.

And yes, I wrote that piece.

~~~
jballanc
Except that evolution _did_ put a man on the moon...unless you are one of
those who don't believe humans are the result of 3 billion years of evolution?

What you have to consider is that individual cells evolved, until they created
a whole greater than themselves. No individual cell was in control of that
first step Armstrong took onto the moon. Likewise, what we end up creating
from so much "bazaar" development will almost certainly accomplish something
amazing...

...without any of us being able to take credit for it.

~~~
IvarTJ
Right. Not never, but after 3 billion years.

~~~
SeanLuke
And an incredible amount of energy from the Sun.

------
rektide
Lambasting libtool for providing a consistent experience across star-NIX is,
imo, not the wisest move for a FreeBSDer.

Article: _This is a horribly bad idea, already much criticized back in the
1980s when it appeared, as it allows source code to pretend to be portable
behind the veneer of the configure script, rather than actually having the
quality of portability to begin with. It is a travesty that the configure idea
survived._

Good high-minded notions here. But configure, with its standardized
parameters for how to do stuff, is near irreplaceable at this point. Certainly
a more stripped-down version, one not built on M4, would be wise, but
libtool/autoconf itself is used too broadly & with trepid familiarity by
developers & upstream maintainers: in spite of so much of it being indeed old,
deprecated, no-longer-useful cruft, the best we can hope for is duplicating a
wide amount of the existing functionality in a cleaner manner.

But at what cost would reimplementation come? How many weird build targets
would silently break and stay broken, unnoticed, for months or years?

The place where we escape these painful histories is where we leave the old
systems programming languages behind. Node's npm I'd call out as a shining
beacon of sanity, enabled by the best, most popular code distribution format
ever conceived: source distribution, coupled with a well-defined, not-
completely-totally-batshit algorithm for looking for said sources when a
program runs & at runtime goes off to find its dependencies:
[http://nodejs.org/docs/latest/api/modules.html#modules_all_t...](http://nodejs.org/docs/latest/api/modules.html#modules_all_together)
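The lookup the linked doc describes can be caricatured in a few lines of
shell. This is a rough sketch, not Node's actual implementation (it ignores
core modules, `package.json`, and file extensions): check
`<dir>/node_modules/<name>`, then retry in each parent directory.

```shell
# Hypothetical sketch of node_modules resolution: walk up from a
# starting directory until <dir>/node_modules/<name> is found.
find_module() {
  dir=$1 name=$2
  while :; do
    if [ -d "$dir/node_modules/$name" ]; then
      echo "$dir/node_modules/$name"
      return 0
    fi
    [ "$dir" = "/" ] && return 1   # reached the root: give up
    dir=$(dirname "$dir")          # otherwise retry in the parent
  done
}
```

For example, `find_module /srv/app/lib express` would try
`/srv/app/lib/node_modules/express`, then `/srv/app/node_modules/express`,
then `/srv/node_modules/express`, and so on up to `/`.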

~~~
phkamp
Why should I as a FreeBSD person not be allowed to lambast the worst
hack-on-hack-on-hack-on-hack I have to suffer?

And trust me, libtool is replaceable; all it takes is agreement on a compiler
and loader flag for producing shared libraries, and you suddenly don't need it
at all.

~~~
rbanffy
But that agreement would be based on our current best understanding of how
computers should work. When you factor in the future (and evolving an existing
codebase is a huge problem) we have to assume any understanding we have now is
incomplete and flawed.

That's why things like libtool or autoconf evolved (or, better, were
"iteratively designed") to be able to grow and encompass varying and different
goals.

~~~
pmjordan
So what you're saying is that progressively tidying up a codebase to simplify
fulfilling current requirements is always a bad idea because you might remove
something that might make a hypothetical future requirement easier to fulfill?

~~~
rbanffy
No. I'm just reminding ourselves that we don't know what we'll need in the
future, that any decisions we make now are subject to change down the road,
and that it's foolish to assume we can design now what we'll be using ten
years from now.

~~~
flatline3
Dynamic linking has not changed in substantially interesting ways in the past
10-15 years.

We understand the pitfalls and complications, and can come up with some
solutions on how to handle them in a forward-thinking way.

If things break in the future, we can revisit this then.

In the meantime, libtool is one of the most ridiculous time-sinks I've ever
had the displeasure of working with.

~~~
mcguire
Compiling and linking a library for XLC/C++ v7.0:

[http://publib.boulder.ibm.com/infocenter/comphelp/v7v91/inde...](http://publib.boulder.ibm.com/infocenter/comphelp/v7v91/index.jsp?topic=%2Fcom.ibm.vacpp7a.doc%2Fproguide%2Fref%2Fcompile_library.htm)

------
justincormack
I think FreeBSD and the Linux distributions try to cater to too many
different people, and quality and coherence suffer a lot from this. I think
we can get past this though. The culture of testing and good code is in the
ascendant again in many quarters. You need more people to understand build,
packaging and distribution better, sure. You also need autotools to die, as
the use cases for it are mainly dead. You can generally write code portable
to the systems that matter if you want to now, and it just works.

A lot of the problems are due to poor integration between languages, so for
example the JVM people have reimplemented almost everything, as have the C++
people.

~~~
phkamp
nodding violently.

------
yungchin
"Later the configure scripts became more ambitious, and as an almost
predictable application of the Peter Principle, rather than standardize Unix
to eliminate the need for them, somebody wrote a program, autoconf, to write
the configure scripts."

I'm not sure I understand how the Peter Principle applies here? Autoconf seems
a rational solution in the economic sense: to standardise Unix, you need to
have lots of influence to effect buy-in, whereas to write autoconf, you need a
big dose of hacking talent. If you have the latter and not the former...

~~~
caf
Right, libtool is the same - the people that wrote it weren't in a position to
demand that all the UNIX-likes out there standardise their ld flags, so they
routed around the problem instead.

~~~
hollerith
>the people that wrote it weren't in a position to demand that all the UNIX-
likes out there standardise their ld flags

Agree, but once Linux became the dominant Unix-like, the major Linux distros
like Debian and Redhat were probably in a position to replace uses of libtool
in upstreams with a distro-wide standard for ld flags.

~~~
caf
They could, but what would be the advantage? Such patches couldn't be accepted
by upstream, and the Debian/Redhat source packages are mostly only built by
Debian/Redhat maintainers, so would the juice be worth the squeeze?

~~~
hollerith
I don't claim to know what other upstream maintainers would've done, but if I
had been one in the 1990s, and Debian and Redhat had agreed on a standard for
ld flags, I would've announced that libtool would be removed from my project
in 24 months, so other Unix-likes should get on the train.

------
lmm
Autoconf is an easy target for these kinds of rants, but you know what? It
does its job, and it does it very well. The ratio of autoconf to non-autoconf
programs on my system is probably 10:1, but the ratio of build problems is
something like 1:20.

If anyone ever managed to write a genuinely better build system, the bazaar
would let it rise to the top; the gradual rise of e.g. cmake is testament to
this. Trying to impose one solution top-down, e.g. the LSB standardization of
RPM, has a far worse track record than letting the bazaar do its thing.

~~~
tedunangst
When OpenBSD replaced GNU libtool with a home-grown perl version, it was so
much faster I believe it literally cut days of machine time off a full ports
build. For smaller packages, with tiny C files, running libtool.sh takes
longer than running the compiler does. The majority of build time for some of
those packages is still running configure, testing for things like <stdio.h>,
for which the package has no workaround when missing. The OpenBSD project
alone has spent _years_ of machine time running configure and libtool.

As for doing its job well, the failure mode of configure "you can't build
this" is abysmal. Just give me a fucking Makefile, I'll fix it myself. I love
packages that come with Makefiles that don't work. I pop
"-I/usr/local/include" into CFLAGS, run make again, and boom. Done. Trying to
do the same with configure? Forget about it. --with-include-dir or whatever
doesn't work, because it's really running some bogus test in the background
which expects /bin/bash to exist and so on and so forth.

~~~
lmm
>When OpenBSD replaced GNU libtool with a home-grown perl version, it was so
much faster I believe it literally cut days of machine time off a full ports
build. For smaller packages, with tiny C files, running libtool.sh takes
longer than running the compiler does. The majority of build time for some of
those packages is still running configure, testing for things like <stdio.h>,
for which the package has no workaround when missing. The OpenBSD project
alone has spent years of machine time running configure and libtool.

Sounds like the bazaar in action. I hope they succeed, but I know of two
Gentoo projects that tried to do the same thing and were eventually abandoned
as unworkable.

>As for doing its job well, the failure mode of configure "you can't build
this" is abysmal. Just give me a fucking Makefile, I'll fix it myself. I love
packages that come with Makefiles that don't work. I pop
"-I/usr/local/include" into CFLAGS, run make again, and boom. Done. Trying to
do the same with configure? Forget about it. --with-include-dir or whatever
doesn't work, because it's really running some bogus test in the background
which expects /bin/bash to exist and so on and so forth.

Sounds like you know make better than you know autoconf; I find it easier to
fix autoconf problems on the rare occasions when they fail.

~~~
tedunangst
Familiarity with make may be part of it, but I picked up that level of
familiarity in about five minutes. Common problem: program adds -ldl to linker
flags (doesn't work on OpenBSD). Makefile fix: vi Makefile, /-ldl, xxxx, :wq.
Done. Autofix: I dunno, but I'm sure it involves a lot more typing. Most
makefiles are simple, most makefile problems and their fixes are simpler
still. The dumbest thing that could possibly work often does.
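The vi edit described above (`/-ldl`, `xxxx`, `:wq`) can equally be a
non-interactive one-liner; the Makefile contents here are invented for
illustration:

```shell
# Strip a stray -ldl from a Makefile with sed instead of vi;
# -i.bak keeps a backup copy alongside the edited file.
cd "$(mktemp -d)"
printf 'LIBS = -lm -ldl -lpthread\n' > Makefile
sed -i.bak 's/ -ldl//' Makefile
cat Makefile   # now reads: LIBS = -lm -lpthread
```

The dumbest thing that could possibly work, in one line.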

------
diminish
I write this as a thankful and proud user of Varnish, which =I assume= phkamp
would place as a cathedral type of software. I agree with the points about
quality requiring someone being responsible, the importance of code reuse,
and the observation of a full 10 years of dumb copy/paste (though I believe
in copy/paste). Eric Raymond's Cathedral/Bazaar distinction is a somewhat
useless dichotomy for the problems mentioned. Here are 2 examples of how
Varnish could do better (independent of being developed as bazaar or
cathedral):

1) Varnish does not support SSL, which complicates deployment architectures;
part of the blame goes to the lack of a good SSL implementation to copy from
or to use to write an SSL proxy:
<https://www.varnish-cache.org/docs/trunk/phk/ssl.html> I find this an
example of neither bazaar- nor cathedral-based approaches yielding something
"acceptable" to copy and paste from in the past 15 years.

2) Varnish's VCL, as a DSL, looks like an obscure M4-style macro language and
its functions are not well planned; another guy (from the bazaar) could
design a more coherent language.

=> To my mind the culprit is the scarcity of monetization opportunities for
infrastructure components, which pushes talent (candidates for "being
responsible") elsewhere, where the money is. I mean, nginx/varnish developers
should earn 100s of millions, according to their merit. People are making
billions by using ruby, python, rails, sinatra, django, debian, varnish,
nginx, openssl, and hundreds of libraries for libtool... but their everyday
creators don't get enough reward back to feel responsible. Passion and hobby,
benevolence or PR are the only drivers for most infrastructure and library
development. Some developers lose their faith in a man's lifespan, fight back
or quit in full despair.

------
anvandare
A restatement of the old dichotomy: the rebel alliance of young, creative,
anarchist hackers who are more interested in fun versus the empire of old,
methodical, hierarchical business-programmers who bean-count every line of
documentation and analysis.

~~~
mjn
Other related, though not entirely equivalent ways of putting it: the Richard
Gabriel distinction between the 'MIT' and 'New Jersey' approaches, and the old
debate between scientifically optimized central planning versus the chaos (and
possibly emergent order) of the marketplace.

------
artagnon
I doubt the dichotomy is between the cathedral and the bazaar.

All software projects have a list of gatekeepers or maintainers, who decide
which changes should go in and which ones shouldn't. Some of them have a long
list of people with commit access (like Subversion), while others have a
handful of maintainers (like Linux). Some of them have grand roadmaps and bug
trackers (like Firefox), while others have no agenda or bug trackers (like
Git). Some projects have many dependencies, use libtool, and a lot of
auto-magic (like Subversion), while others are still adamant about not having
a configure script, maintaining a handwritten Makefile, and having one or two
dependencies (like Git).

From what I've noticed, as a general rule of thumb, the projects with a few
strict maintainers are better off than those that give away commit access
easily or have lazy maintainers. It seems obvious when I put it like that.

The way I understand it, the cathedral model is about carefully planning out
the project, assigning various people chunks of the total work. The bazaar
model is open to accepting random contributions from a wider audience, and
runs on a looser agenda. It's just that the cathedral model works better for
certain kinds of software (games, office suites, professional audio/ video
suites), while the bazaar model works better for other kinds of software (web
servers, version control systems, development toolchains). When it comes to an
operating system, having a curator decide APIs and standards (OS X) certainly
wins over a mashup of billions of packages (Debian).

~~~
npsimons
_having a curator decide APIs and standards (OS X) certainly wins over a
mashup of billions of packages._

That's your opinion, and you're entitled to it, but that certainly doesn't
make it true. To me, Debian has always been the most elegant of operating
systems, with a lot of thought put into how things are laid out and making
sure that literally thousands and thousands of programs play very nicely
together, not to mention compile and run properly on a multitude of
architectures. And while I'm sure you can drag up some anecdotes about how
Debian didn't detect some guy's wireless chipset, I can find complaints about
things in OS X (right in the replies to this article!) that I've never had a
problem with in Debian. Not to mention I don't like being told what I can and
cannot do with my hardware.

~~~
artagnon
True, I use Debian myself and can't stand OS X. I meant that for an average
end-user, OS X makes more sense.

------
engtech
I hadn't realized that autoconf used M4. M4 is amazingly hard to work with,
mainly because it uses ` and ' as delimiters, making it very hard to even read
unless you have syntax highlighting set up to show those quotes as different
characters.

Here's an example from the web:

       define(`START_HTML',
       `<html>
       <head>
         <meta http-equiv="Content-Type" content="text/html;
          charset=iso-8859-1">
         <meta name="Author" content="D. Robert Adams">
         <title>$1</title>
       </head>
       <body text="#000000"
         ifdef(`BACKGROUND_IMAGE',
               `background="BACKGROUND_IMAGE"')
         bgcolor="#e5e5e5" link="#3333ff"
       vlink="#000099"
         alink="#ffffff">
       ')

~~~
pdw
m4 allows you to redefine the delimiters (yes, it's that crazy). Autoconf uses
square brackets.

------
blinkingled
It's a matter of perspectives. The article author's is that of taste. The
bazaar folks' (including web startup folks) is that of practicality. To
illustrate using the same example from the piece: so what if a bunch of
crypto code is copy/pasted? It's all out in the open; if anyone ever comes up
with an actual problem with the copy/pasted code, how hard can it be to fix
it?

The fact of the matter is that the world doesn't have a critical mass of
people with agreeing tastes and fundamentals to get them to come together and
build something moderately complex.

What we do have however is a world full of reasonable people who just want to
build something and make it work so they can solve some problem of theirs.
There are also people who will happily reuse what the folks before them built
- improving it in the process.

If we were to impose a high barrier to entry, requiring people to code and
design systems in the absolute best possible way, we would effectively be
saying only a few will build software for the whole world. That'd be a net
loss IMHO.

The point about bad design is worth arguing when software doesn't do what you
want it to do, i.e. fails in practice. But that is mitigated by the fact that
each individual stall in the bazaar does enforce some design, some testing,
some sort of sanity check to ensure it at least works most of the time.

Is it a sad situation? Yes, we would be much better off with every programmer
being perfect. But so long as that is impractical, the bazaar alternative
works, creating tons of fixable, mostly usable, continuously improved
software - so I can hardly call it a lost cause.

~~~
cantankerous
I think the big problem with all the copy/paste code is that you wind up with
many redundant dependencies and equivalent pieces of software of varying
quality. Even granted that it's open source, you can't rely on any kind of
competitive (i.e. market-driven) mechanism to improve all of these redundant
pieces of software over time. This is an inefficient part of how things are
done... but that's not to say there's an obvious, more efficient alternative
to what we're doing now.

~~~
blinkingled
Absolutely - it is a problem in that it is inefficient and error-prone. And
@DanBC - yes, no doubt people shouldn't be writing their own crypto (but to
be fair, copy/paste is not akin to rolling their own crypto - it's just a
wrong way to reuse it.)

But my point was that this isn't an "it doesn't work" type of problem; it's
an "it can get better" type of problem - and I've seen many instances where
these types of problems are discussed on the mailing lists of several
projects and people with an itch to fix them go ahead and do it. It just
highlights the difference in perspectives: get it to work first, and even if
we had to hack it, we can make the design better later on if enough people
are bothered by it.

------
meta-coder
People are getting confused between two different metrics for software.

Software License metric: [F] Free and Open Source, [P] Proprietary/Closed
source

This metric can also be modeled as a continuous variable rather than a
discrete variable. But let us stick to two values for simplicity.

Development model metric: ranges from extremely [C] Cathedral-type,
.........., to extremely [B] Anarchism/Bazaar-type

FOSS proponents don't care about development model as long as it's FOSS.

Let [x][y] denote the Software License metric (x) and Development model metric
(y) of a software project.

Observations:

1\. [P][C] is the combination that FOSS proponents hate the most.

2\. [x][B], where x ∈ {F, P}, is less peer-reviewed (anyone can commit
anything), so there is less accountability/responsibility; highly
decentralized, so no guarantee of quality.

2.1. [P][B] sounds like a contradiction!

2.2. [F][B] Poul-Henning Kamp seems to have problems with this kind of setup.

3\. [F][y] where y -> B (i.e. closer to [B] than to [C]). Mostly the same as
[F][C] except that it is mildly better.

4\. [F][C] In this setup, the cathedral authority is the bottleneck in
improving the project.

5\. [F][y] where y -> C (i.e. closer to [C] than to [B]). This kind of setup
"works like a charm!" See the overall success of GNU/Linux in
the industry! There are some people who act as maintainers of Linux, but come
on, you too can become one! The more popular such a software is the more
thoroughly it is reviewed. "Given enough eyeballs, all bugs are shallow."

------
daa
This comment thread is a bit depressing, but ignoring that.

One of the bits about the piece that has me scratching my head is whether the
mess that is dependency management in OSS operating systems (and OSS software
distribution models generally) matters _enough_. While I very much share the
author's reaction of "can't we do better?", it also feels that fixing that
mess isn't so much a design exercise as a social one, because OSS isn't
really one bazaar but a constellation of connected, more-or-less-cathedralish
bazaars. And I don't know that this mess is deeply problematic (although the
author does find some egregious issues).

Design happens at a specific scale, and some scales reward investment in
design more than others. Designing a chair that can be mass-produced is more
effective than designing a room, which can't. One could argue that designing a
self-contained piece of software (e.g. the Python runtime) matters more than
designing a deliberately open system (e.g. the ecosystem of Python libraries),
and that the alternative (random competition, forking, etc.) is Good Enough.

[for carbon dating: had my first IT job in the early 80s, as a teenager]

------
sedev
"Quality happens only when someone is responsible for it."

I admire that as an example of good headline-writing: it tells you one of the
important points that the article wants to make. I find it interesting to read
this piece in light of recent discussions about Worse-Is-Better vs. The-Right-
Thing: I read PHK as arguing definitely from a The Right Thing perspective. I
find his argument appealing, but I'm not entirely sure that I'm persuaded by
it. On the appealing side, I look at Apple, Python, and emacs: things I use
every day, things shaped unmistakably by someone with vision, authority, and
taste, and think that there's got to be something to this argument.

------
davidw
> Brooks offers well-reasoned hope that there can be a better way.

I'd prefer to see the better way, in terms of a working system, rather than
read about it. That doesn't mean something that's better than, say, Linux in
whatever niche: that's not _too_ hard. I mean something that can be used and
repurposed for things the original authors had no idea about.

------
mikecane
I'm not a coder. And that piece illustrates why Maemo went nowhere with the
general public and why Nokia's Internet Tablets failed. I tried to download
software only to find out they required "dependencies." Then I had to chase
those down. And once I did, I found the software to be buggy and unreliable --
and that's why people go buy iPads. They don't want to jump through so many
hoops to get so little. The journey is no reward.

Edited to add: See "The Paradox of Choice" by Barry Schwartz too.

~~~
Sharlin
That's why you need a curated package repository. I don't even have to know
that "dependencies" exist to use Debian or Ubuntu; the package manager is
nice enough to inform me that it has determined that these-and-these other
packages are required to install the package I wanted, and that it's now
going to install them for me automatically, but those are just details that
could easily be abstracted away.

~~~
flatline3
Curated package repositories are just a band-aid on the disaster. They improve
things by centralizing the ridiculous and unnecessary cost of dealing with the
mess.

It would be better to simply not have that cost.

~~~
snogglethorpe
Eh? The OP here was lauding the _iPad_ as some sort of "solution" to the
"mess," but Apple's App Store is pretty much the purest and most absurdly
restricted example of a "curated package repository" around!

Debian's (and Ubuntu's, etc) package repositories are a breath of freedom and
flexibility by comparison...

~~~
flatline3
Curated Linux package repositories are all about making painful things
slightly less painful, including:

\- Managing complex dependency graphs. There are often incompatible changes in
the nodes of the dependency graph that require wholesale updating of every
node linked to the changed one.

\- Managing complex installation. As per UNIX standards, the files which
comprise a single software package are spread across the file system in such a
way that it's effectively impossible to clean them up without a package
manager properly tracking them.

Apple's "curated package repository" is something else entirely.

------
anuraj
The bazaar was possible because cathedrals like Unix existed which were
robust enough to gloat over the failures of sound design on the part of
newbie programmers. Yes, we are now coming to increasingly complex systems on
the web too, which are still held together with nothing more than duct tape.
Scripting languages which are not type-safe, cut and paste, and tolerant
browsers which hog memory are all manifestations. Not a great way to build a
huge edifice, I would say.

------
peterwwillis
It seems like FOSS and Unix-related software development was never held
accountable to customers so it never made uniformity - or even reliability - a
requirement. If something doesn't work, it's not my fault! It's one of the
thousands of other products I depend upon that aren't shipped with an
operating system that's to blame. _It worked for me_ is the cry of the guilty
developer.

You'll notice that Windows development follows a model based on at least the
principles that code isn't cheap and you are accountable to a customer. Even
Freeware is developed with a mentality that ensures the shipped application
will work as well as it can without the customer needing to do anything.
Duplicated code becomes a requirement (when you can find code to duplicate)
and dependencies are non-existent.

The idea that you have to go get some other product and install it to make a
Windows app work would be absurd to a user. Yet in the FOSS world, it's
absurd not to assume this. There's also some expectation of payment from
these users. Could it not be that the quality and usability of FOSS could
benefit from more commerce?

~~~
phkamp
You're not old enough to remember how "InstallShield" came about, are you ?
:-)

~~~
peterwwillis
Nope. Care to enlighten?

(Regardless of what may have prompted its development, its use since around
1992 doesn't really discredit my point... It takes care of things for the user
that FOSS doesn't deal with, or at least not in a portable way)

~~~
phkamp
InstallShield was created by a startup, to give Windows apps a non-insane way
of installing themselves.

Microsoft was forced to buy them, almost at gunpoint, by 3rd-party software
developers.

~~~
peterwwillis
Cool. Now if only FOSS would adopt a non-insane way of installing software...

~~~
acdha
"apt-get install <name of package>" is about as good as software installation
gets

~~~
peterwwillis
Actually there's much better, but none that are freely available. And it's
still distribution-specific which is retarded.

------
glaurent
The fundamental flaw in CatB is that it was essentially an adaptation of
ESR's libertarian creed: that the Free Market is infallible and will always
converge on the optimal solution. While that may work at the scale of a
single project, it fails when applied to a full ecosystem, and thus we get
dozens of competing projects aiming at solving the same problems (WMs,
desktops...).

It also postulated that code being open would incite devs to be at their
technical best, in order to gain peer respect. That didn't happen either;
quite the contrary, the OSS community has proved to be rather conservative
and traditional, with Unix seen as "The Right Way", not to be deviated from.

In short, the Bazaar model completely failed in its promise to always let the
best solution win. What we got instead was perpetual chaos, and less than 1%
of the desktop share.

I attended the 1st GUADEC back in 2000. If at that time we had known where
we'd actually be 12 years later, we'd all have left in disgust.

I moved to OS X in 2008, my only regret was not doing it any sooner.

------
etrain
Serious question for phk: Despite your rant, why do you continue to use
autotools in Varnish? Is it because libraries that Varnish depends on require
autotools? Or did you simply follow convention?

~~~
phkamp
The reason varnish uses autotools is that I delegated the build/release
infrastructure to somebody else.

From my own time as FreeBSD release engineer I know that task must come with
freedom to pick tools and methods, and therefore it does.

That doesn't mean I have to like the choice, but I have to respect it (or
take over the build/release responsibility myself.)

~~~
emmelaich
But pretty much everyone is in the same boat. NOBODY likes autotools, but
nothing better has come along yet that does the same job; when it does, we'll
switch.

Don't underestimate the size and demands of the job though.

------
awongh
It seems that people would like to think that cathedral > bazaar is like
waterfall > agile, but I'd also posit that it can mean walled garden > OSS...
the iPhone is a beautiful object. I'm also post-'90s, so I've never been
involved with a cathedral project, but I see his point.

On the other hand, why can't I watch a Flash movie on my iPhone? Sometimes
it'd be convenient to do that. I think it's funny that there is no
engineering restriction preventing that; it's simply an effect of cathedral
vs. bazaar culture at Apple. In fact people forget that the true cathedral
would be an iPhone without the App Store, as Jobs had originally envisioned.

I definitely don't agree with the assumption that the cathedral is always
better and preferable.

------
hollerith
Thanks, phkamp, for the best explanation I've seen for why I switched from
Linux to OS X. I was employed in an IT job before 1990.

------
gruseom
Here's an optimistic thought: one great company can change things. Google's
engineering-driven culture did that. And Apple has made people care about
design, albeit only the surface-aesthetic sense of "design", which is what
they identify with Apple despite Jobs' dictum "design is how it works". Well,
suppose a new company came along that succeeded massively by means of good
software design. A lot of things would turn on their head. People would have
to stop saying "that's not feasible" or "things are too complicated" or
"market forces demand otherwise", just as Apple's success has caused them –
against all odds! – to have to stop saying those things about elegance and
beauty.

Such a company could have a profound effect because many of the best and/or
most aptitudinous (sorry) programmers would be inspired by it. Right now we
have no great example to point to, just a bunch of intuitions and (by now)
raggedy traditions that we barely understand, which is what I take to be the
point of the OP. But great design feeds the soul in a way that the steaming
landfills many of us spend our days adding code to never can. So there's a
fair amount of talent out there waiting to be kindled. One bolt of lightning
and who knows.

Yeah it hasn't happened yet, but it's easy to imagine how it could. Historical
accident plays a role in such things.

------
typicalrunt
There's an element to that article that is complaining about the process that
Unix took to get to where it stands now. I feel the author needs to skim some
books on evolution to see that there is nothing inherently wrong with this
approach.

For instance, his example of Unix is not unlike any large organism that,
through years of evolution, has many structures that are no longer needed
(e.g. appendix) or structures that seem unnecessarily complex (e.g. recurrent
laryngeal nerve) when our 20/20 hindsight can design a better solution.

I didn't find too much advice in the article, but I wish he had proposed
something better. Even life can potentially create a mutant that contains no
archaic structures in it, but it will only last if it passes a fitness test.
Why doesn't he try rewriting some of the gnarled pieces of Unix code to make
it better (like removing the need for autoconf)?

~~~
dllthomas
> I feel the author needs to skim some books on evolution to see that there is
> nothing inherently wrong with this approach.

This is broken thinking.

You encountered,

"Process X has attribute A, and this is bad."

and you respond with,

"No, because Process Y also has attribute A."

This is only valid if we are either 1) guaranteed that there is nothing bad
about process Y, or 2) considering processes X and Y as alternative options
whose merits we are comparing.

Dispensing quickly with 2 - we are considering "Cathedral-mode development"
vs. "Bazaar-mode development", not "Evolution" vs. "Bazaar-mode development",
so this is not applicable.

Without further support, option 1 strikes me as confusion of is-ought. It _is_
the case that evolution works this way, so any design process _ought_ to work
this way. There is, however, plenty "wrong with" evolution as an approach to
producing designs for a given purpose. It is horribly inefficient compared to
design, suffers from greater path dependence, doesn't keep notes about systems
that may have worked that went extinct for unrelated reasons, etc. The
attribute A that you are comparing is in fact one of the things "wrong with"
evolution and "wrong with" Bazaar-mode development.

(Note, of course, that a problem with evolution in terms of producing optimal
designs for our goals is not the same thing as a problem with evolution as an
explanation of the world around us.)

------
GnarlinBrando
All of this is just such antagonistic linkbait if you ask me. No offense to
OP, but it really just seems like "oh, I had a controversial thought, so I'll
play up two sides."

Maybe one isn't better than the other? Maybe they are just different kinds of
things and some people work better in one or the other. Maybe some projects do
too. We live and work in dynamic environments and there is no big-T Truth in
design. People change, the requirements change, the market changes.

Indulging in this kind of petty software holier-than-thou while highlighting
only flaws and failures as defining characteristics of another generation is
only destructive for the entire community.

If OP was serious this would have actually been about showing a new generation
of coders good cathedrals, and not just banal whining.

------
gabordemooij
According to Wikipedia, the 'Cathedral' and 'Bazaar' are supposed to describe
open source project development models. So, how can MS Windows, Office and OS
X / iOS be 'Cathedrals' or 'Bazaars'? AFAIK none of them is (fully) open
source.

From Wikipedia: "The Cathedral model, in which _source code_ is available with
each software release, but code developed between releases is restricted to an
exclusive group of software developers. GNU Emacs and GCC are presented as
examples."

Sounds more like Android is a 'Cathedral' and Linux is a 'Bazaar' ?

<http://en.wikipedia.org/wiki/The_Cathedral_and_the_Bazaar>

------
cellularmitosis
I wrote this analogy to explain this situation to my Facebook friends:

Hypothetical cooking analogy for the non-techies: Back in the day, if you
wanted to cook up a fancy meal, there was only one butcher in town, and he did
things the "right" way. He only used one variety of beef, because it was the
"best", and he only sold the "best" cut of that beef. Similarly, there was
only one farmer, who only sold the "right" crops, because they had the "right"
flavor. When you wanted to cook a fancy meal, you didn't really have any
decisions to make, because there was only the "One True Way": one "correct"
cut of meat, one "correct" vegetable selection, etc.

Since then, an explosion of variety has occurred. However, perhaps we have
swung too far in the other direction. Rather than choosing among the six
different makers of pasta sauce, you now must choose among six varieties of
pasta sauce from each of the six makers. This way lies decision fatigue.

However, the community of grocers got together and came up with an ingenious
artificial intelligence agent built into a pick-packer system. Now, you don't
need to even know a single brand name of pasta sauce. You can simply arrive at
the store, tell the system what recipe you are following, tell it that you
prefer your tomatoes from Spain, your beef the organic grass-fed variety, and
your garlic from the Mediterranean, but you'll take whatever variety of pasta
is local and most fresh. The system then assembles your grocery cart
automatically for you, and you can choose to remain blissfully ignorant of all
of the decision-fatigue inducing choices which occurred under the hood.

However, if you are of the crotchety-old-man sort, and you "Only buy my rib-
eyes from Lambert and Sons, because he's the only butcher who does it right",
you can of course choose not to use the AI pick-packer, and you can take it
upon yourself to select the "best of breed" ingredients for your meal, and
pretend that the explosion of variety never happened.

So here's the deal. Poul-Henning Kamp wrote an article lamenting the fact that
the explosion of variety happened and that the AI pick-packer was invented,
because now everything is a giant mess, and back in his day, there was only
one right way to do it, dammit.

------
revorad
_His most recent project is the Varnish HTTP accelerator, which is used to
speed up large Web sites such as Facebook._

The Cathedral speeding up the Bazaar? Oh sweet irony.

------
keeptrying
Nature does pretty well with bazaar-style development, and she usually wins,
doesn't she?

There's a lot to be said about building via rapidly iterating around feedback.

~~~
wglb
But that process has a way of ruthlessly culling the herd. Does not seem to be
happening in this world of infinite backwards compatibility.

~~~
phomer
And I have a tail-bone, but no tail ...

------
pasbesoin
_Thirteen years ago, Eric Raymond's book The Cathedral and the Bazaar
(O'Reilly Media, 2001)_

Off by two?

Seriously, though, how do you get 13 out of that (unless calculating from when
the author conceivably began working on the book per se)?

I'm starting to read the article; I found it mildly disconcerting that the first
sentence appears to make a mis-statement, but I don't expect that to
foreshadow the meat of the message. (We'll see. :-)

~~~
phkamp
The editors put in that reference, the essay was available before the book was
published.

~~~
pasbesoin
Fair enough. But a reader not familiar with this is going to wonder. (Maybe
none, given that this is in the ACM Queue.)

Maybe the following edit?

"Thirteen years ago, Eric Raymond's _essay and subsequent_ book The Cathedral
and the Bazaar (O'Reilly Media, 2001)..."

P.S. Done reading now. Enjoyed/appreciated your essay. Amongst other things,
perhaps useful for the generation younger / newer to the technology to
appreciate where this "configure soup" comes from.

------
ucee054
I wish there were an English word for _Bazaar_; we used to have the word
_Market_, but now that word means _aggregate demand_, i.e. the _job_ market,
not the _fish_ market.

Also I wish we could understand "Cathedral vs Bazaar" as an _analogy_ and stop
comparing Notre Dame Cathedral and Istanbul Rug Market in this thread.

"Rug Market" software = I recompile my kernel to upgrade = Debian

"Cathedral" software = the central planning organization sell me my OS upgrade
= everything Apple

It should be obvious that Cathedral development can guarantee higher quality.
Critical systems get written in Ada and VLISP and run on QNX. You would be
_insane_ to try to deploy a critical system in C++/Ruby on Debian, and,
depending on the circumstances, you might get arrested if you tried.

"Rug Market" however can scale to larger codebase sizes, because "Cathedral"
costs more per line of code. So "Rug Market" wins, when software correctness
doesn't _really_ matter.

~~~
zem
"bazaar" is pretty much an english word by now.

------
papsosouid
How ironic that this complaint of poor quality and lowered standards comes
from a FreeBSD developer who actively attacks people who care about producing
quality software. Cathedral vs bazaar isn't the issue. It is simply putting
quality as a priority vs not making it a priority. Which is why some
cathedrals produce poor quality software (freebsd), and others produce high
quality software (netbsd).

~~~
phkamp
I do ?

~~~
papsosouid
Remember when freebsd included a closed source binary blob to support atheros
wireless chipsets? And some people in the openbsd and netbsd groups decided
that since their OSes have quality standards and don't allow garbage like
that, they would make an open source driver? And then you yelled shit at them
at a conference and called Reyk a terrorist? It is one thing to hate everyone
who works on an OS other than the one you prefer; that's just typical
childishness. But to attack them for having higher standards than you do? Yes,
I would say that puts you in the position of "not being one to talk about
quality".

~~~
phkamp
Let's just say that maybe I have a slightly different recollection about that
meeting ? :-)

~~~
papsosouid
"First talk was OpenBSD's wireless talk. Much rethoric about freedom and
little appreciation for the way the world actually works, as opposed to how it
looks like it works from the headquarters of OpenBSD.

Asked directly if they thought they could defend their reverse engineering of
for instance the Atheros HAL. The answer as I heard it was "Laws don't apply
to us".

I guess I'm not the right person to appreciate the valor of their crusade, but
somehow their rethoric reminds me too much about Rote Arme Fraktion and
similar terrorgroups of the seventies which tried to change society by bombing
political correct holes in it.

If OpenBSD aims to corner the paranoid/radical part of the market they're
welcome to it for all I care.

Doesn't sound like it is doing anything good for their wireless support
however."

I dunno, sounds to me like you bitterly despise anyone who dares to do things
correctly instead of doing whatever is expedient. OpenBSD's stance on the
wireless issue was correct, has been proven to be correct, and should be
admired. And I'm a netbsd guy, so I am "supposed to" hate openbsd.

~~~
phkamp
The fact that OpenBSD put any users and source-code holders in USA in direct
violation of USA (stupid!) laws on spectrum access is the context you are
missing here.

~~~
papsosouid
Even if we were to throw common sense and reason to the wind and pretend that
is even remotely close to the truth, it would in no way explain your
behaviour. You outright stated that "the way the world works" is accepting
closed source garbage that can not be verified, debugged or fixed. You think
the way the world works is compromising security and stability for
convenience. And you belittle people who do care about quality, as if it
somehow hurts you that such people exist.

------
cfront
In the article he uses the example of compiling Firefox.

This is a lengthy affair on any of the BSD's.

The question is, is it even worth it? I mean, what does Firefox give me that
is worth all that time compiling?

One has to have a big bloated browser to do online banking and similar things,
but it is entirely unnecessary for lots of other things, like YouTube. To get
content and play it, which are two separate tasks, you certainly do not need
Firefox.

That makes kids upset. They have invested all their energy into learning the
browser and how to manipulate it. 100% web everything. Client-server.

Anything that threatens these ideas is offensive.

Well, that's what's keeping back progress.

You will not find the freedom and flexibility to explore the possibilities by
trying to do everything through a browser, over the web, and solely within a
client-server construct. Those are self-imposed limitations that today's
developers readily accept.

As phk says, he's writing for people who can think for themselves. If you
cannot think for yourself, if you're just good at repeating the same old
things: "OSS is great!" "The web is amazing!" "Google!" "Facebook!" then his
writing will not make sense to you and will probably be offensive.

If I'm not mistaken, he's implying we can do better. And the evidence to
support that is that we have done better. But it was a long time ago, and
bubble-seeking programmers do not want to pay attention to history. Go figure.
The sad truth is most programmers today are caught in the mediocre moment,
thinking it's Nirvana.

Please don't wake me up. I'm enjoying this dream.

------
cfront
phk how dare you question autoconf, libtool and package collections.
Simplicity? Are you kidding? Nothing is simple. Simplicity is HARD. You are a
grumpy old man. Leave us kids alone, we're having fun.

End sarcasm.

I can compile my bloated BSD kernel loaded down with every available driver
I'll never need in a fraction of the time it takes me to compile Firefox: that
is, measured in _minutes_. On an underpowered computer it can take _days_ to
compile Firefox with all its supposed dependencies. I say supposed because no
one is really sure. It's too much work to check and really find out.

Anyway, negative coding has been outlawed. It's inconsiderate. Even if you
found something that could be removed, you would sacrifice every ounce of your
karma if you dared to remove it.

End more sarcasm.

The FreeBSD ports collection, like OpenBSD's or pkgsrc, is a black box.

Unfortunately, with that last line, I'm dead serious.

My solution to autoconf and libtool is to run make >log 2>log2 and then do
some sed transformations on the logs to produce my own "compile script".
Sometimes I even change libtool's script to make the "quiet" flag do what the
"verbose" flag does, so it can't hide anything it's doing. I then save my
"compile script" for the next time I need to compile the program on that type
of system.
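A rough sketch of that log-scraping trick, for the curious. The sample log,
file names, and the sed pattern here are all illustrative, not taken from the
parent comment; a real run would start with `make V=1 >build.log 2>&1` against
an actual source tree:

```shell
#!/bin/sh
# Stand-in for a captured build log (in practice, produced by running
# "make V=1 >build.log 2>&1"; V=1 defeats automake's silent rules so the
# real compiler lines appear instead of "CC foo.o" summaries).
cat >build.log <<'EOF'
  CC       util.o
gcc -O2 -Wall -c util.c -o util.o
libtool: compile: gcc -O2 -Wall -c main.c -o main.o
gcc -O2 -o prog main.o util.o
EOF

# Distill the log into a standalone "compile script": strip libtool's
# "libtool: compile:" prefix, keep only lines that invoke the compiler,
# and prepend a shebang so the script can be replayed on its own.
{
  printf '#!/bin/sh\nset -e\n'
  sed -n -e 's/^libtool: compile: *//' -e '/^gcc /p' build.log
} > compile.sh
chmod +x compile.sh
```

The resulting compile.sh rebuilds the program with plain compiler
invocations, no autoconf or libtool involved; the same effect as patching
libtool's "quiet" flag to behave like "verbose", since either way the goal
is to see (and keep) the commands actually run.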

One of my todo projects is to analyse the sum total of *BSD patches, looking
for patterns. One of my pet peeves is having to make patches just to compile a
program on each BSD. Silly little MI differences that totally defeat
automation and require silly little patches even for the simplest of programs.

Anyway, you can't say that the youth has completely abandoned "traditions"
from ye olde UNIX, no matter how much they seem to hate "old things". They
stand by the old relics of autoconf and libtool year after year and even
vigorously defend using them.

The reason they can't replace these crusty old things is because they refuse
to put in the effort to learn how they work. If you want to build a better
mousetrap, first you have to understand how the old ones work, and
specifically how and where they could be improved.

