First: It would help this discussion a lot if we could have the informal convention that people who were employed in an IT job before 1990 marked their posts. I think it would show a quite clear divergence of attitude.
Second: It's very obvious that a lot of you have never been anywhere near the kind of software project posited in the "cathedral" meme; instead you project into the word whatever you have heard or feel or particularly hate. That's not very helpful, given that there is an entire book defining the concept (I believe it's available online on ESR's homepage; how about you read it?)
Third: No, I'm not of the "either you are with us, or you are against us" persuasion. The bazaar is here to stay, but having everybody in need of transportation buy the necessary spare parts to build a car is insane.
Fourth: Related to point two really: A lot of you seem to have little actual ambition of making things better, I guess that is what happens if you grow up in a bazaar and never even experience a cathedral. I pity you.
Second: I worked on OS X. In fact, I worked on Snow Leopard, and the start of Lion. What's interesting about that is that Snow Leopard was the last version of OS X developed according to the "Cathedral" model. Also, while the Snow Leopard cathedral was being built, iOS was being developed firmly using the bazaar model...
Third: You know, I wonder if you've ever been to a bazaar before? I live in Turkey, where the bazaar is a way of life (and where one of the largest, oldest bazaars in the world is located). I've never found "spare parts" at a bazaar. What I have found is some of the highest quality jewelry, tapestries, rugs, and other hand-made goods you'll find anywhere.
Fourth: I think you're conflating "good quality"=>Cathedral and "poor quality"=>Bazaar. The only thing that distinguishes the Cathedral and the Bazaar is whether or not there is one single individual in whose head the only valid vision of the completed project exists. You might do well to read up a bit on the history of Kapalıçarşı. Throughout its history there were guilds to enforce authenticity and all manner of quality control mechanisms. It is possible to have a Bazaar and a very high quality product.
> The only thing that distinguishes the Cathedral and the Bazaar is whether or not there is one single individual in whose head the only valid vision of the completed project exists.
Core OS and related groups have long consisted of VERY different teams, each working in their own fiefdoms with their own methodologies. There was a somewhat coherent vision overall, but everybody achieved it in their own way, with very mixed results -- OS X has some tremendously terrible code. Security framework, Installer team, mDNS, anything related to Server ...
To be honest, I'm not even sure what your point actually is.
The point is that, if one of the workers building the Notre Dame had gotten hot after a day of work and said "You know what we should include here? A swimming pool!" well...he probably would've been committed.
Coherent vision and a priori design are not the same thing. Apple has coherent vision. Bazaars can have coherent vision (should have coherent vision if they hope to be successful). But that's not the same thing as a Cathedral's a priori design...
A priori design can adapt to new ideas. I'd argue that's what Apple did/does, in many cases.
Likewise, I'd argue that the truly ad-hoc bazaar development is responsible for some of the worst ideas and bad code that can be found at Apple.
The historical lack of good centralized vision on aspects of the Core OS -- such as Objective-C -- has led to staggering missteps and ridiculous inefficiencies on behalf of both the framework and compiler teams. This has been to nobody's benefit and the sum result is clearly inferior to better-designed language work done elsewhere (eg, MS).
Likewise, the ability for applications teams to drive forward ill-conceived OS and framework hacks has led to some terrible long-lasting implementation failures, which is something a coherent top-down vision could have prevented.
However, the fact is that products can succeed despite their poor implementation. Costs may be higher, bug counts may be higher, and user satisfaction may be lower, but that hasn't always stopped Apple from building successful products. Where I take umbrage is in the notion that there's a dichotomy -- either you do things poorly and let intellectually lazy engineers take the lead, or your product does not succeed. That's not accurate.
I don't think Apple is a good case study for your point.
Compare FreeBSD kernel design versus Linux.
kqueue vs. dnotify/inotify/???
Mach-descended VM vs. a string of linux-vms
BSD scheduler, ULE scheduler vs. how many different schedulers?
Linux churns through ill-conceived solutions to problems until they find one acceptable enough. FreeBSD grinds on one until it definitively works.
FreeBSD almost invariably winds up with the better solution, in less time. See kqueue, for example -- the foundation upon which Apple's GCD is built.
Planning ahead can lead to great things. The Notre Dame, and the dozens of other cathedrals throughout Europe are positively stunning...
I love and I hate kqueue. I mean, I love kqueue. I love the way it can be used from C, I love the way it's integrated in MacRuby...I spent the weekend studying Clojure's reducers and was dying to have some time to work on ClojureC just so I could implement reducers with kqueue...
I hate kqueue because, in all likelihood, I'll never get to use it in a production system, because it's not in Linux.
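To make the portability complaint concrete, here's a minimal sketch (using Python's standard select module purely as an illustration) of the branching that portable event-driven code ends up doing because kqueue never made it into Linux:

```python
import select

# kqueue exists only on the BSDs and macOS; Linux grew epoll instead.
# Code that wants scalable readiness notification in production has to
# pick a backend at runtime (or pull in a wrapper such as libevent/libuv).
if hasattr(select, "kqueue"):
    backend = "kqueue"   # FreeBSD, macOS, NetBSD, OpenBSD
elif hasattr(select, "epoll"):
    backend = "epoll"    # Linux
else:
    backend = "select"   # lowest common denominator, available everywhere
```

Wrapper libraries exist precisely because of this split, but they tend to flatten kqueue's richer interface (its unified handling of sockets, files, signals, and timers) down to whatever the common subset supports.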
Cathedrals can be nice to look at. Bazaars are often more functional.
Except that FreeBSD is functional in production, so what is the actual problem? That market effects and accidents of history resulted in Linux becoming more widely adopted?
What does that argue for, exactly?
http://en.wikipedia.org/wiki/USL_v._BSDi was hugely crippling at a very critical moment.
In a similar vein, PostgreSQL lost out to MySQL in no small part due to:
- PHP supported MySQL out of the box.
- MySQL was slightly easier to get running.
If there's a lesson we ought to learn, it's that market success may be partially or fully disassociated from actual merit relative to other market entrants. We've seen this in commercial software. It would be foolish to think it doesn't apply to open-source.
And what happens if it never works?
libkse was an attempt to implement M:N thread scheduling. This is something that can, in theory, provide significant benefits over 1:1 thread scheduling, but in practice, is extremely complicated to implement and has fallen out of favor across the board: Solaris abandoned the approach in Solaris 9, and FreeBSD abandoned it in (IIRC) FreeBSD 7.
Concurrent to libkse, libthr was developed by David Xu, and was also included in the FreeBSD base system. It implemented 1:1 threading, is far simpler than libkse, and has replaced libkse as the default threading library.
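For readers who haven't met the distinction: under 1:1 threading (the libthr model) every user-level thread is backed by its own kernel thread, whereas M:N multiplexes many user threads over fewer kernel threads. A quick illustration of the 1:1 case, using Python threads (which are 1:1 on all mainstream platforms) purely as a stand-in:

```python
import threading

# Under 1:1 threading, each user thread maps to its own kernel thread,
# so each one observes a distinct native (kernel-level) thread id.
ids = []
barrier = threading.Barrier(4)  # keep all four threads alive at once

def record():
    ids.append(threading.get_native_id())
    barrier.wait()

threads = [threading.Thread(target=record) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# 4 user threads -> 4 distinct kernel threads. Under an M:N scheduler,
# several user threads could share a kernel thread, and the scheduler
# in the threading library, not the kernel, would decide who runs.
assert len(set(ids)) == 4
```

The appeal of M:N was cheaper context switches between user threads; the cost, as the libkse experience showed, was enormous implementation complexity for benefits that rarely materialized in practice.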
I would argue that in this case, FreeBSD's cathedral model failed; M:N threading ideally would have never been attempted, and it was wasteful to attempt to implement two distinct threading libraries. However, considerable thought and expertise went into the work, and the work (and decisions around it) were not made flippantly or taken lightly.
It was simply a case where, despite best intentions and best effort, the wrong choice was made. At the same time, in what some might call a "bazaar-like" move, David Xu maintained libthr as an alternative. It remained there, ready for adoption as the default threading library, until such time as it became apparent that M:N via libkse was a dead-end.
This was a mistake, but no entity is perfect, and the success rate of FreeBSD's considered decision-making remains statistically high. In this case where something never worked, it was replaced with something equally well-considered and far more successful.
Moreover, compared to Linux's gross missteps with their threading implementations (including such gems as setuid() being thread-local as an implementation side-effect), libkse was a minor fender-bender.
There seem to be some underlying assumptions of omniperfection hung on the "cathedral" meme in these parts. They are simply unfounded in both theory and practice.
Second, cathedral vs. bazaar is not really about governance, but about architecture, and they are two separate power-structures, although most organizations, including FreeBSD, mingle them, to their own disadvantage.
We can argue if attempting M:N was a wise or a factually based decision, but that has nothing to do with cathedral vs. bazaar, certainly not in this particular case: What happened was that M:N came first and that was that.
Only once it clearly transpired that it could not deliver on its promises in practice (partly because every thread programmer assumed 1:1) did the actual decision-making apparatus of FreeBSD kick into gear.
In Linux, (which is much more cathedral than FreeBSD) or OpenBSD (at the time even more so), that decision would probably have been executed practically instantly, but in FreeBSD which is mostly consensus driven, it took some time (and angry words etc.)
But overall the libkse vs. libthr saga has almost nothing to do with cathedral vs. bazaar, because architecture was not the driving force at any time during that saga.
Indeed, throughout this discussion it has become clear that you regard anything with a sense of design and some amount of quality control as a "Cathedral". If you want to discuss the benefits of design and quality control, that's a perfectly fine discussion to have, but it also seems to be completely orthogonal to the original discussion of cathedrals and bazaars, which was wholly focused on the development process.
In fact, I wonder if your arguments would be better focused on the concept of "craftsmanship" (or lack thereof) in programming...keeping in mind, of course, that both cathedral builders and bazaar artisans have historically had a notion of craftsmanship.
I think it is plausible to describe the kernel as much more Cathedral-like, at least in parts where Linus is very strict about how components should be integrated. I can see where parent could have got this impression.
At the same time, there are parts of kernel that are bazaars. Virtualization/Hypervisor support is the area I'm most familiar with; the various hypervisor vendors basically crammed their paravirt calls and passthrough drivers in, without unifying the interface at all. Yes, Linus and others were very aggressive with maintaining the code quality, but less with the architecture.
There are other areas of kernel (particular drivers), where we've seen similar situations.
That said, given your response, I think I perhaps still don't understand the meaning behind your cathedral/bazaar dichotomy.
While it's been years since I mucked about in Linux kernel space, I always remember it as a very open and egalitarian community. Although I never attempted to contribute to any of the BSDs, I got the impression that they were much less open to newer contributors and to divergences from "accepted wisdom" (this could be put more politely as being more discriminating (as in taste) and careful). That being said, the majority of decisions I witnessed on LKML appeared to be well-considered, and make no mistake: patches and design ideas were not just accepted willy-nilly. While Linux has made many missteps, I still personally prefer the wild possibilities and endless choices that eventually shake out a solution, while keeping to a distribution (Debian stable) that filters out the issues for end users (myself included).
I also find very interesting the parallels between this discussion and the "worse is better" essay, where UNIX was supposed to be in the "worse" camp, and now it's Linux.
It definitely is, in my experience. Most importantly, despite the obvious presence of "celebrities," even completely unknown people can jump into the middle of a conversation on an important topic, and if they have their shit together, will be accorded pretty much instant respect and be treated as an equal. There's very little sense of "needing to pay one's dues."
But you definitely have to have your shit together. If you don't, you will be quickly eviscerated.
It's really rather nice, especially compared to many dev communities where there's often much more sense of entrenched factions.
I was hoping someone would bring up OS X and how horrible its cathedral-born API became around 10.5, but I was born in '85 and therefore have no right to speak in this thread.
This article concludes that quality only happens if someone takes responsibility for it. Yes. You'll find no argument on that point from me. But cathedrals are not (edit: should say "not only") about one person taking responsibility for quality. They are about a priori design.
Quality control can happen ex post facto but creativity, once strangled, dies.
It seems to me that quality is about having someone on top who is willing to give direction and vision to a project, instead of giving every person who scratches their own itch full influence. Contributors are great. But letting everyone pull in their own direction doesn't lead to something that feels well engineered.
PHK seems to be calling this vision and cohesive design (a priori or not) the "cathedral model". I agree strongly that for a project to work well and feel "high quality", it needs some source of a unified idiom.
If the suggestion of the article is that the only way to retain quality is to move back to the "single vision" world of a priori design, I'm sorry...that ship has sailed, the cat is out of the bag...whatever your favorite analogy, PHK is very right that the new generation has gotten used to not having instructions handed to them.
Of course, anyone is free to start a project with a single vision, recruit new members, and do their best to grow the project. I suspect, however, that such an effort would lose out to one that figures out how to develop a coherent vision without the need for a priori design.
Nobody believes in "a priori design" and I somehow doubt that anybody ever did. Brooks points out that the original publication of the "waterfall model" was meant as "how not to..." and people got that wrong.
But cathedrals are not about a priori design, they are about style, elegance, economy of means and coherency of design.
But my point in the piece is that the lost generation doesn't even know what a cathedral is in the first place, having grown up in the bazaar.
ESRs original prototype for the "Bazaar" was Linux. Do you feel that Linux is lacking in coherency of design? You refer to the dot-com bubble, but the problem with the bubble wasn't, I think, that it was the "Bazaar". Indeed, I don't think the dot-com bubble of the late 90s was characterized by much open source development at all!
It was, rather, consumed with "flashiness" and "wow factor". I definitely see the continued obsession with these things as a problem. I would say that, for example, much of the obsession with Node.js today is a consequence of this obsession. But that isn't a Bazaar.
It's a disco.
This is true. The OP has made several somewhat arbitrary and vague assertions about what may or may not constitute a "Cathedral", on top of lambasting others for not agreeing with his definition ("Read Brooks' book").
Excerpts from the HN discussion :
"Windows and Office are actually not examples of cathedrals, because the architectural focus of Microsoft was not on software but on a near-monopoly market."
"...iOS is very much a Cathedral and has a designer and architect who cares and who is in control."
Does Apple not care about achieving market dominance? Did no version of Windows or Office have a lead architect who "cared"?
Frankly, the entire concept of a "Cathedral" - rigidly defined as it appears to be here - is a little nebulous and hand-wavy.
You're calling a lot of people out on using Microsoft as a cathedral example, so I assume you want the original bazaar-vs-cathedral reading of the paper. But if you do that, how can autoconf be a bazaar example, when it's GNU software?
With more open development communities there are tradeoffs, but often the result is that you can do stuff you simply wouldn't have the manpower to do in closed communities.
I think most of us would love it if every library and tool we used was pretty and elegant, but in the end, it's often better to have something, however much you'd ideally wish for it to be prettier.
Moreover, a community in which different components are often the result of disparate teams with different thinking probably results in more attention paid to robust and simple interfaces between them, simply because that's the only way you can get everything to work. Systems composed of heterogeneous components with robust and simple interfaces between them are, I think, a good thing. I suspect they tend to be more future-proof than systems with everything designed by one person from the top down, because in the end change is inevitable, so having to deal with heterogeneity from the beginning is an advantage.
It is not just that M4 is a horribly ugly, badly designed, inconsistent language. You also have about a million leaky caches in the build process to deal with. Autoconf rewrites the build scripts in three stages, each of which is cached, and it also keeps an autom4te.cache directory around. So what happens is that you update something in the M4 build script and then have to spend hours hunting down why the change didn't change anything, because some stupid autoconf cache wasn't flushed.
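The standard blunt workaround is simply to nuke every cache and force a full regeneration. A small sketch in Python (the helper names are mine, not part of any tool):

```python
import pathlib
import shutil
import subprocess

def clear_autoconf_caches(srcdir="."):
    """Remove autoconf's stale caches so that edits to configure.ac
    and the *.m4 files actually take effect on the next run."""
    src = pathlib.Path(srcdir)
    shutil.rmtree(src / "autom4te.cache", ignore_errors=True)
    (src / "config.cache").unlink(missing_ok=True)

def regenerate(srcdir="."):
    """Throw the generated machinery away and rebuild it from scratch
    (requires the autotools to be installed)."""
    clear_autoconf_caches(srcdir)
    subprocess.run(["autoreconf", "--force", "--install"],
                   cwd=srcdir, check=True)
```

That this dance is needed at all is, of course, the complaint: the caches exist to speed up a tool whose whole job is regenerating scripts, and they silently go stale.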
What autoconf has done is to separate free software developers into two categories. Those who can hack the source (the .c and .h files) and those who understand the "autoconf magic." It should be telling that people refer to it as "magic" - it's a fing build system, not rocket science.
The only reason this ugly piece of software has lived on is that people treat it as some kind of magical device, plus inertia. Configuring and building software is an easy problem. It gets even easier if you finally decide to drop support for 30 year old unices that no one uses. The autoconf team consistently refuses to do that, which is why the software won't ever become better.
CMake, SCons, Waf and probably half a dozen more tools all do a much better job configuring software than autoconf ever will. People really need to try those alternatives and I'm sure they will realize autoconf needs to die, die, DIE.
CMake, SCons, etc, all have their pros and cons. They are hardly paragons of virtue though. [Yes, even as a mere "consumer" of CMake (someone who occasionally needs to build packages that use it), I've been bitten and frustrated by it.]
As far as I've found, there really isn't any build tool that really gets everything right -- which suggests that the problem is harder than you suggest. [Although judging from the number of people who try to write their own build tool (most of which languish and eventually die, but cause some pain along the way), it's clear enough that many people think it's a simple problem...]
> autoconf needs to die, die, DIE
It doesn't matter to a "works fine" person how many faults their current tool has or how much more smoothly "your" tool solves all those problems; they won't listen. Logic doesn't work, maybe because it is very hard for them to get used to something new, maybe because if you haven't experienced any alternatives it is very hard to imagine that something could be better.
I think bazaars like the Turkish one referenced above (with guilds and standards of quality) are the best of all worlds. But to get there, a guild needs a strong, opinionated leader. Torvalds perhaps. (PHK seems to fit the bill too :-)
So, what I suggest is the issue here is too many people trying to build a cathedral in a bazaar - they want everyone to build it their way but cannot persuade enough others to follow them. As such the bazaar is full of many half-finished carpets instead of fewer, perfect carpets made by engaged teams.
Software needs many talented people, working together, mentoring new ones in "their way". A bazaar is great for getting different talented people to find each other and choose between many different projects. A Cathedral is great for encouraging apprenticeships, the promulgation of culture and so on.
I think that one of the best lessons to take away from this whole flame fest is that history is something to learn from, and not just blindly revere. But first, that requires knowing about history, which far too many people don't. The whole issue of whether cathedrals or bazaars produce better software is orthogonal, and needs to be addressed situation by situation.
First, are there still places where cathedral-style development is a very good idea, bordering on necessary? Yes, but it has to be done right, and unfortunately cathedrals aren't (usually) very flexible. The Mars rovers were all cathedral-programmed and work pretty darn well; on the other end of the spectrum you can find all sorts of big projects that went awry, even with "coherent design".
Second, is cathedral design necessary for all software today? No. I'm fortunate enough to have the perspective of a second generation programmer; my father started in middle school by sending punch cards to the university and would get his results a week later. Careful planning was not just a good idea then; it was absolutely necessary. These days he has a computer with more power and connectivity in his pocket than was available when he was born. His turnaround time for seeing results on software changes these days is well under five minutes.
So the computing industry has come so far so fast that it's not just unnecessary to make grandiose plans in advance, it may actually be a bad idea in a number of cases (what was it Paul Graham said? "If you want a recipe for a startup that's going to die, here it is: a couple of founders who have some great idea they know everyone is going to love, and that's what they're going to build, no matter what.").
It's kind of nice that anyone can just pick up a computer and start writing software these days. Sure, sometimes it's scary too, to think of all the inefficiencies, and wrong results and security holes that plague software. But it's so nice and liberating to think: "you know what? I can do it better, I have the freedom to try to make something better." And you can develop all cathedral style if you want!
Could software be improved? Definitely! Should people learn from history? Yes, including the mistakes. Would more design and coherent vision help make better software? Maybe; go ahead and prove it (a quick note: unfortunately, the market doesn't select the best software or designs, so you will have to prove it some other way, but being successful market wise is a good start).
I will tell you, though, that Snow Leopard missed its ship date by 8 months (of course, you never heard about that because Apple is smarter than to announce an OS ship date before it goes GM). It was also nearly unusable for about a month, and only marginally usable for two months beyond that, during development. Lion suffered from none of these issues...
I guess post-Snow Leopard OS X and Apple apps just feel cheap, unpolished and rushed. It actually feels a lot like Windows, where the quality of different parts of the system varies wildly.
Examples: Finder and anything file-browsing related beachballs much more than before (particularly when dealing with network shares). Visual design seems to have taken a hit: everything seems to consist of either different murky shades of grey or jarring skeuomorphism. The 10.6-era AHCI kernel panic hasn't happened for a while now, so maybe that was fixed in a newer revision of 10.7 or 10.8, but userspace stability seems much worse. Some of it is crashes, some just weird UI glitches that require a restart or Force Quit. Old problems (like Mail.app reliability or the heap of shit that is iTunes) are ignored completely in favour of sexing up the UI. (The runaway dynamic pager issue FINALLY seems better on 10.8, but again we get loads of other regressions.)
Add to that what you say: the "improvements" leave most people I know completely cold or even get in the way, and that's not just power users/developers. Since we're "paying" for the (subjectively) unnecessary bling with regressions, the overall impression is that of a negative change.
Not that any of this is particularly relevant to the development model. You can end up with festering layers of crap with either model if nobody feels responsible for overall quality. I suspect the objective for Lion was "make OSX look kind of like iOS and get rid of all of that GPLv3 software" and that's basically what was achieved. I guess bug-free software isn't good business, Microsoft were printing money for decades.
You can say that again. Frankly, after using Mac for 10 years, this is my last one.
Hackintosh or something even worse (and I am with you 100% on the horrible state of OS X) -- I wouldn't want to discount the quality of the hardware.
In fact, it actually reminds me a lot of the change from FreeBSD 4.x to 5.x ... things worked, people were happy ... and then boom.
That was when all of my desktops and laptops stopped running FreeBSD and I "switched" to the mac. I am fairly certain that once running SL stops being practical, I will switch again.
Nothing against a good conspiracy theory, but if you truly think this is intentionally planned then this thought is a bit … dumb.
It does feel that Apple's development model is changing to an annual release cycle, wherein larger/systemic bug fixes are deferred to the next major release, rather than done in a point release. In fact, most OS X releases these days feel like "minor tweaks that make consumers happy + a truckload of bug fixes and plumbing". The fact that they deferred so much 10.0-era plumbing to 10.6 and beyond is indicative of why there are a lot of regressions, IMO.
Minor issues like reworked Spaces and full-screen mode making a second monitor pointless are irritations, but I wouldn't regard them as reflective of the quality of Lion and ML.
Whether products are developed according to what users want vs what builders want has almost nothing to do with bazaar vs cathedral.
Indeed, many bazaar-style projects (the linux ecosystem in general, gimp, etc) are almost cautionary tales about "builders creating what they want for their own niche purposes and if anyone disagrees they're free to fork off".
Their inscrutable beauty is buried under tons of libraries nobody will ever touch for fear of breaking 20 years of development efforts, exactly like what happens in the Unix world. Their move to the 64-bit world was painfully slower than what their fellow merchants accomplished in the Unix bazaar. They still provide compatibility layers for programs built with technologies that have been thought of as extinct, like monks still praying to the gods of ancient Greece. Whenever they went for the "total reuse" mantra, they built terrible and insecure specifications (DCOM) that still saddle us 20 years later. And let's not even talk about portability, which is anathema: to each his own Cathedral and his own Faith, touch ye not any unbelievers!
So yeah, making mistakes and keeping around the cruft is something every long-running IT project can experience. Unixes are, arguably, the longest of them all, so the ones that naturally tend to show it more. Besides, there's a whole new world of applications to be built out there, if we were rewriting libtool every three months we'd move even more slowly than we do now.
<ad-hominem>Oh, and I wish I could say your rant is unbefitting of professionals of your age, but I'm afraid it actually quite matches the grumpy-old-man stereotype you're clearly striving for.</ad-hominem> Hey look, I can do ad hominem too, and I was born in the late 70s!
Your diagnosis of its qualities is spot on.
"ad hominem" means to "attack the man", ie: I single identified man, saying some generalities about identifiable groups is not "ad hominem".
And yes, I am a grumpy old man, and a surprisingly cheerful one at that.
Your remarks are essentially ad hominem regardless of whether you single out 1 man or 100 individual men.
The words "cathedral" vs. "bazaar" have obviously gotten too vague for people to meaningfully argue about. But "coherent design vision" has a much clearer meaning, and an enormously important one.
I think the pendulum is bound to swing back from incoherent, hypercomplex software, because at some point the reductio that we've got today simply won't be able to adapt. Who knows what will trigger that, or when. But intelligent people will always care about simplicity, beauty, efficiency, and the other qualities that come from good design. And among the ignorant programmers there are more than a few whose eyes light up when they are finally exposed to good design, and want to learn to work that way. I'm an example. So there's no point of no return here.
The UNIX design philosophy -- small utilities loosely joined into a coherent whole -- is a coherent vision for a platform, under the definition you've repeated. And the Microsoft Office design philosophy, if what you're saying is true, is a bazaar because it's "incoherent."
If the UNIX ecosystem is a Cathedral and the Windows ecosystem is a Bazaar, I think we can safely say that you're not using the words the way the classic essay used them, and that the definitions you're using are malleable enough that any argument about aesthetics could bend and twist them into synonyms for "stuff I like" and "stuff that annoys me."
Coherency cannot be measured, and it's extremely subjective. The *nix world can be extremely coherent: for example, you'll always find libtool, whether you want it or not, and your '70s-like filesystem layout. Isn't that "coherent" with its history?
This is not true even of Android and iOS, the modern wonderchildren he cheers on: a quick look at the Android filesystem hierarchy will show its adherence to outdated conventions (etc?) and incoherent repetition (sys? system?), and I bet you'd find something similar in iOS. So there might be a "coherent vision" behind, but the practice is quite incoherent; and mind, we're talking about very basic OSs that delegate any but the most basic functionality to third party apps. I'm happy to bet that in 10 years, the Android codebase will be as shitty and crufty as any Unix.
The truth is that a cathedral, in phkamp's rant, is "software that can all fit in one's head". For all operating systems, we're well past the stage where a single genius architect could envision the totality of a massive cathedral, for a number of reasons (time, working culture, legacy tech etc). Clearly phkamp, being more intelligent than most, only just passed that threshold with the very latest FreeBSD release, and felt the urge to tell the world.
"Software that can fit in one's head" and "coherent design vision" are closely related things. I'm in favor of both. If you're saying we're past the point where that's feasible in real-world systems, that's just assuming the conclusion – the wrong conclusion, in my view.
So, you are defining "cathedral" as "good"?
Microsoft didn't architect their software as software; they shaped it as tools of monopoly enforcement.
There are numerous accounts from people involved about how marketing decisions relating to cutting off 3rd parties or trying to bludgeon somebody into submission caused them to cripple their own software and its architecture.
See the M$/Novell case for ironclad evidence of this.
This kind of passive-aggressiveness only reinforces the grumpy-old-man stereotype and makes it really hard to take you seriously.
I do think that was a mistake. But MS didn't (doesn't) have the luxury that Apple did (and still does). MS can't just shut 30 years of compatibility off and expect to keep their corporate customers happy.
Oh, and just for the record (since this thread seems to be grouped by age): I was born in 1963. I remember when MS was cool and "us" nerds were running from CA and IBM.
It's been twelve long years since OpenOffice moved from the cathedral model to the bazaar model. And it still suffers from the same criticisms you level against Office, and more.
It is important to note the distinction between FreeBSD's package system and, say, Debian's apt. In FreeBSD I can build a package from source via ports, while in Debian I can apt-get install a package; because the Debian packages are prebuilt, the install just comes over in several chunks and doesn't need to build anything. (Yes, you can pull prebuilt packages for FreeBSD too.) But my point is that FreeBSD's packaging system, as used, conflates building packages with using packages.
So if, as an example, I write a perl script that goes through the source code and changes all the calls to fstat() to match your configuration, then building requires perl even though running does not: once the binary is built, I don't care whether you have perl or not.
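The perl pass described above boils down to a source-to-source rewrite that runs only at build time. Here is a minimal sketch of the idea in Python (the function name, the fstat()-to-fstat64() substitution, and the snippet of C are all made up for illustration):

```python
import re

def rewrite_calls(source: str, old: str, new: str) -> str:
    """Replace calls to old(...) with new(...) in C source text."""
    # The \b boundary keeps identifiers like my_fstat untouched.
    return re.sub(r'\b%s\s*\(' % re.escape(old), new + '(', source)

c_code = "if (fstat(fd, &st) < 0) return -1;"
print(rewrite_calls(c_code, "fstat", "fstat64"))
# prints: if (fstat64(fd, &st) < 0) return -1;
```

The tool that performs the rewrite (perl in the comment above, Python here) is purely a build dependency; the compiled binary carries no trace of it, which is exactly the build-time/run-time distinction being made.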
But let's get back to the volunteer/paid thing again. People who volunteer rarely volunteer to clean the shit out of the horse stalls; no, they volunteer to ride in the races and build a new horse. So you end up with a lot of stuff around that you don't want.
Sadly for operating systems, and the original rant really is about operating systems, there isn't really a cathedral/bazaar split; it's more of a democracy/feudalism kind of thing. Nobody 'owns' the final user experience for the OS in FreeBSD/Linux.
It's also a matter of what people are willing to pay for. Even (or perhaps especially) in commercially supported software, cleanups only happen if there is a strong business case for the cleanups. And if the cost of buying an extra build machine to run long complicated configure scripts is significantly less than the engineer time to re-engineer a new autoconf system from scratch, in many companies (and certainly most startups) --- it won't happen.
The OP cares very much about code quality as a good and important thing in and of itself. But that view isn't shared by many business people, or by many programmers in general, for better or for worse. Some have argued that OSS code tends to actually be _better_ about code cleanliness, because it's public, and people do care about making sure that their code is clean. (I've never seen the proprietary source code for the Oracle DB, but there are many stories out there about how horrible it is from a code cleanliness perspective.)
Also, the OP seems to care a lot about extra library dependencies. The big problem here is that a lot of people don't really care if their package uses perl or python as part of their build scripts/makefiles --- or even if their package uses perl _and_ python scripts. The OP cares, but for better or for worse, most people don't. And I would wager this is likely true at most companies where programmers are paid to maintain the source tree as well!
Yes. Much of his poor experience is due to build systems.
Anyone who's developed anything can tell you how fragile and finicky build systems can be.
Since build systems are used by developers, who can deal with complexity much better than "ordinary" users, they tend to be rough around the edges.
They're also unglamorous infrastructure, so people tend not to spend time on them unless they're paid to. And they have a tendency to break the entire application when broken, and are often part of the "interface" of the software (since every distro that builds package X from source uses build scripts that rely on, for example, ./configure with particular options), so maintainers tend to take an "if it's not broke, don't fix it" attitude.
GNU projects in particular -- whose autoconf the author complains about -- have quite a bit of built-up cruft, since many of them are very old and are also vital infrastructure for most FOSS OSes.
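To make the autoconf complaint concrete: a project's configure script is generated from an input file along these lines (a minimal, hypothetical configure.ac; real GNU projects accumulate hundreds of such checks, many probing for platforms that no longer exist):

```
AC_INIT([hello], [1.0])
AC_PROG_CC
AC_CHECK_FUNCS([fstat strlcpy])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

Each AC_CHECK_FUNCS entry compiles and links a tiny test program at configure time, which is where much of the slowness and cruft the author complains about lives.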
Of course, I'm biased, because I have such a degree. However, when I compare what I build to what is built by a business school grad who thinks anyone can learn computers, so everyone should be taught business, I really cringe at the loss of formality in the industry.
Needing to build a computer processor from AND and OR gates really drives the concepts home. Building an operating system in C++ really drives the concepts home and creates a framework for thinking about computer based problem solving that's lost in many of the systems I look at today.
I really think it is a matter of exposure -- at one time all anyone ever saw were cathedrals, and so that was all anyone ever built. I feel now we've swung too far the other way, where there are many in the craft who've never seen a cathedral and have only known the bazaars. The reality is we need both, and the craft of building cathedrals is becoming endangered.
You are a terrible example because you come from a formal background of a hard science. Even worse, you come from one where proofs (the cathedral in the mathematics sense) are required or you aren't taken seriously.
The parent was talking about joe schmo off the street or the high school wunderkind who is building "twitter for teens" or "pinterest for social good". They are just slapping it together from the get go without thinking about any aspect of the design. Eventually they'll have to hire the formally schooled to come in and clean it all up, if they actually get anywhere with it.
Which makes me wonder, if more formally educated individuals were in the start up game would the failure rate be so high as it is now?
Uh, we are talking about the same Cathedral and Bazaar thing here, right? Cathedral in the sense that only the anointed master architects get to make decisions; everyone else does what they're told and keeps their mouths shut; bazaar in the sense that everybody brings whatever they've got and hopefully the good stuff hangs around?
Everyone brings whatever they've got, and the anointed master architects sift through it, determine the best course of action, redirect known-bad avenues of exploration and possibly anoint new master architects from the best of the candidates.
I asked a question, you are making a statement. If you have numbers that prove that the formally educated are failing more then I think we'd all love to see them. Otherwise, this is just conjecture.
As for 'science projects' failing it could be a number of things. I would assume that hard science startups fail a lot of the time because of cost. "Twitter for teens" is much cheaper to start than one doing alternative energy. Also, hard science is less sexy than a lot of the social stuff that currently dominates the landscape. Thus, it can be much more difficult to find the money you need due to visibility. The social startups will probably pay out sooner than long-tail hard science companies.
This isn't to say that they don't exist, but there are a wealth of examples demonstrating the transition of major progress in those fields to companies like Google and Amazon. For anecdotal examples, see GFS, MapReduce, and Dynamo; there are many unpublished/trade-secret examples that will take years to come out.
Additionally, many schools have reduced their requirements for what exposure students should have to systems and OS:
A Harvard CS concentrator, for example, is not required to take either the introductory systems or operating systems classes; he/she may opt to take a high-level mobile programming class instead. Indeed, the operating systems and distributed systems courses are taught every other year, further reducing enrollment. The last time CS 161 (intro OS) was taught, there were only 23 students. 25 in CS 153 (Compilers). This is a stark contrast to the 100-200 students who took other upper-level CS classes (189 in mobile, for example).
While I will not contest the value of the formal CS education, I see some of the 'naive'(for lack of a better term) CS proliferating even there.
From googling around, might you be referring to:
"Systems Software Research Is Irrelevant"
And there's a little slashdot discussion:
I don't agree strictly with everything he says. It is a polemic rather than a careful argument. But still, he has a point and the data about OS research is strong (even beyond his data).
Of course, I may be biased because I've seen more poorly-built results from cathedral approaches.
My company ships quite a bit of FreeBSD (as pfSense). Could have gone linux, but linux is a mess (much worse than ports.)
I think OpenBSD is a mistake, at best it belongs as a group focused on security inside the netbsd project, but of course, Theo got kicked out of netbsd, thus: OpenBSD.
I'm also the guy who appointed Russ Nelson the "patron saint of bike shedding." Just FYI. :-) (None of Eric Raymond, Russ Nelson or Theo de Raadt likes me much.)
The first time I read of 'Cathedral & Bazaar' I thought ESR was illustrating and contrasting the BSD vs. Linux development model. Only later did I understand that he was pointing fingers at GNU/FSF, not BSD.
There exist several larger OSS projects, such as Apache, Boost, the Linux kernel, etc., which accept contributions but are also curated. Thus they represent a sort of hybrid between cathedral and bazaar. People who use these projects know they are getting some (varying) standard of quality.
I think these sorts of projects -- often shepherded by some kind of noncommercial Foundation or Organization -- are the best way to get a mix of openness and quality going forward.
The only example that jumps out at me is the original Unix, and I think you'll agree that comparing software from 30 years ago that does vastly less than ... pretty much anything out there these days is not an entirely fair, nor useful comparison.
Android is the most original-UNIX-like in its development, but I imagine you could extend it to systems in which large portions of source are available (since that is the kind of project C&B discussed) such as iOS and perhaps Java. Google Chrome. vBulletin. The Pine MUA. The ssh.com RFC 4253 implementation. Various parts of RHEL. And probably numerous "Free" programs that end up following the Cathedral model merely due to the culture of their maintainers.
We can think of many but the point is to make sure we're on the same page. It's not like all software projects in the world are neatly divided into "bazaar" and "cathedral". I don't know that Google Chrome is a "cathedral", for instance.
It's ridiculous that the author refuses to give a single example, instead opting to say basically "see! You people don't know what a cathedral is, just like I said."
I will cite as evidence two of my favorite programs ever:
WriteNow (for Macintosh) an excellent early word-processor written entirely in assembler. Unequaled for many years for its combination of stability, raw performance, and ease of use, ultimately it simply could not add new features (let alone make the jump to PowerPC) and died on the vine.
HyperCard, which was perhaps one of the most dazzling, innovative, and influential products ever to ship, which pretty much stopped evolving once its original programmer lost interest.
Today we have the phenomenon of the incredible version one product, usually developed by one person, which never really makes it to 2.0. These seem very much like cathedrals.
Most projects go nowhere, but once a project gains momentum it's going to split into pieces and each piece has a chance of "going bazaar" at each step.
Now, I've not been working in the software industry since before 1990, so I'm obviously not qualified to comment, but I'll say this: I can understand why a cathedral model might be warranted. Even using something like CMMI might be a good idea in some cases.
But for a lot of things, especially exploratory/experimental things, the bazaar model is really nice and can reap significant benefits. And the nice thing about the bazaar model is that if you want to follow a cathedral model, no one is stopping you! Go off and be your own little dictator with a "grand unifying vision". Come to think of it, that seems to be what many of the most successful open source projects are: one (or a few) people have a vision of an itch they want to scratch, and they pursue it with a bloody-minded persistence. The bazaar only comes in when someone forks or in the fact that anyone can compete or (try to) contribute.
And BTW, I do know who you are, and have a lot of respect for you, but in some ways this article (and your comments here) could be read as a reaction against the success of Linux and other, more open OSS; the BSDs have always been more insular (or discriminating, depending on your POV) and developed more along cathedral lines than Linux; oddly enough, this has resulted in three distinct BSDs while there is still only one Linux kernel. I will agree that reading Brooks (and other computer history) is almost always a good idea; just MMM was enough to open my eyes to how little the industry has progressed (VirtualBox/VMWare? That's nice; IBM was designing full system emulators for hardware that didn't yet exist in the sixties).
I think people do want to make things better, but it's happening in a much more decentralized way, with small teams taking small, safe steps on tools, libraries & frameworks.
Media, on-line banking, 3d printing, accessible hardware hacking, e-commerce sites, scientific/engineering software: the majority of these things are made by small teams plugging together the best libraries and platforms out there, wherever those already solve part of their software problem. Android, Cloud, Linux etc. It would be silly to say these were without ambition and don't contribute to the community.
(The average software dev's impact factor may be diluted by the increasing number of people working with computers, but computers are so globally useful it's inevitable. If good things are still getting made, then who cares.)
In this comment I will equate "Rug Market" with "Ready, Fire, Aim" and "Cathedral" with "4 Year Plan".
Firstly, the "Rug Market" beats the "Cathedral" when you haven't formulated the problem properly, and so you have bad specs.
Secondly, the "Rug Market" beats the "Cathedral" when bad software is more profitable than good software. Google for "Worse Is Better" and "The Innovator's Dilemma".
Sad but true.