Originally, it looked like this:
* MIT philosophy: never compromise on correctness, even in bizarre corner cases. Aim for conceptual beauty from the programmer's point of view.
* New Jersey (Bell Labs) philosophy: compromise on correctness in exchange for simplicity or performance.
In 1985, six years before "Worse Is Better" was originally written, the "New Jersey" attitude was probably the more useful one. Most people who wanted to write acceptably performant software had to do it in assembly. C was less of a leap than Lisp was, both conceptually and in terms of average-case performance: people who'd spent years writing assembly programs could learn to write performant C, while writing performant Lisp would have been much harder. A contemporary Common Lisp executable is at least 40 MB; that obviously wasn't the case in the 1980s, but at the time, 1 MB of memory made for a powerful machine. "Worse is better" worked in the 1970s and '80s. If every piece of computing work had had to be perfect before it could be shipped, we'd be far behind where we are.
Also, quite a number of the original Unix programs were for natural-language processing (at a level that'd be primitive today) and paper formatting. With the resources of the time, it would've been impossible to get much of that stuff perfectly right anyway.
Bell Labs wasn't full of the anti-intellectual idiots who invoke worse-is-better, lean-startup tripe today. They knew what they were doing, and they knew the compromises they were working under. They bet on Unix and C rather than Lisp machines, and they were right. In 2014, thanks to our ability to stand on the shoulders of giants, on platforms that were built in C, we have machines that can efficiently run code in pretty much any language, so the C programmers and the Lispers have both won. At least, on that front.
However, the "worse is better" lesson doesn't apply nearly as well in 2014. We can do about 500,000 times as much computation per dollar as we could in 1991. That's 500,000 times (at least!) as many opportunities for things to go wrong. A bug that happens once per 100 billion operations used to be negligible; now it often isn't.
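The scale argument amounts to back-of-envelope arithmetic, which can be sketched as follows. The 500,000× growth figure and the 1-in-100-billion bug rate come from the text; the 1991 operations-per-dollar baseline is a made-up illustrative number, and the point is only that expected failures scale linearly with the amount of computation performed:

```python
# Hypothetical back-of-envelope: expected failures grow linearly with
# computation. Only the growth factor and bug rate come from the essay.
growth = 500_000            # 2014 computation per dollar vs. 1991 (essay's estimate)
bug_rate = 1 / 100e9        # a bug that fires once per 100 billion operations

ops_1991 = 1e11             # illustrative: operations bought per dollar in 1991
ops_2014 = ops_1991 * growth

expected_failures_1991 = ops_1991 * bug_rate   # roughly 1 failure
expected_failures_2014 = ops_2014 * bug_rate   # roughly 500,000 failures
```

Whatever baseline you pick, the ratio of expected failures is exactly the growth factor, which is why a failure rate that was once ignorable can dominate at modern scale.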
Unfortunately, we have an industry beset by mediocrity, in which commodity developers are managed by commodity business executives to work on boring problems, and low software quality is simply tolerated as something we'll always have to deal with. Instead of the knowing compromise of Bell Labs, "worse is better" has evolved into the slipshod fatalism of business people who just assume that software will be buggy, ugly, hard to use, and usually discarded after about 5 years. Yet we're now in a time where, for most problems, we can affordably do them correctly and, because things happen so much faster now, we often put ourselves and our businesses at serious risk if we don't.
What was good enough in the '80s is not good enough in 2014, because our sense of "good enough" has changed. Security requirements, for example, are far higher today.
The question is not whether Better or Worse is better. The question is: what is good enough?
It's not clear to me that this is the case anymore. The whole startup thesis is that a small, excellent team can outperform vastly larger teams of average people, which is the opposite of how it was in the '80s.