This varies by the company, but many companies are full of self-important people like this.
It's all well and good being self-taught or university-educated, but inevitably you need a mix of both disciplines to be able to function properly within a company.
I use the term morons advisedly; it is blunt but accurate, a description of the "get it done" mentality of people with little experience and no formal training, who don't even know or acknowledge the depth of their ignorance and error.
The degree (or lack thereof) isn't the issue. Having a CS degree doesn't make you immune to any of your assertions. What you're describing is just a bad programmer.
I believe the GP's argument is about programmers with CS degrees who don't 'get' simplicity, because they can't validate their own intelligence and qualifications with it. I would never tar all programmers with the same brush, but much like the ignorant 'get it done' guy, there's the ignorant 'flex my brain muscles, fuck everyone else' guy too.
There are of course many ways to be a bad programmer. "Look how smart I am" is one kind, but I don't worry nearly as much about them because I have found that far more often I am dealing with the "get it done" kind of bad programmer.
Or to track how many people might self-identify as a "get it done" developer who might be labelled as a "look how smart I am" developer (and vice versa).
Your insinuation that doing something quickly means doing it badly is, in my opinion, unfounded; being productive and speedy is merely a by-product of knowing what you are doing.
Of course, I've been that guy who has to clean up too. That job sucks. :( Almost never is the product designed the way it eventually ends up being used.
I guess if the codebase is a mess of spaghetti, that is another matter. I do always try to get the interfaces right from the beginning.
Everyone should know a few languages so they have the tools they need to do their job properly, would we use a chainsaw to screw something to a wall just because we hadn't learned how to use a screwdriver yet?
Frankly, pointing to being able to implement algorithms as a measure of a programmer's ability is next to useless. Any code monkey can ape a data structure they've seen in a book. Designing code so that when requirements change you can easily swap that data structure for another is the real test.
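The swappability point can be sketched in Python (a minimal illustration with invented names, not from the original comment): code that depends only on an abstract interface doesn't care which concrete structure backs it.

```python
from collections.abc import MutableSet

def add_unique_tags(store: MutableSet, tags):
    # Depends only on the MutableSet interface, so the backing
    # structure (a built-in set, a tree-based set, a Bloom-filter-backed
    # wrapper, ...) can be swapped without touching this function.
    for tag in tags:
        store.add(tag)
    return store

print(sorted(add_unique_tags(set(), ["a", "b", "a"])))  # -> ['a', 'b']
```

Swapping the data structure then means changing only the call site, not the logic that uses it.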
PS: I can implement any number of algorithms and design patterns, though I rarely get to.
On your PS of "I rarely get to", would you go further and say "I rarely need to" as well? Because that's how I feel for about 95% of the work I do. There's a few places in larger projects where I enjoy getting to test out a few different approaches to data structures in anticipation of potential use cases (speed, concurrency, reporting needs, etc). But after a while, I tend to know in advance what will work 'well enough' (or maybe even what the 'optimal' structures are) and don't need to (or get to) spend much time experimenting.
I know that what you really wanted to say is that people who lack motivation, passion and curiosity are worse programmers than those exhibiting these characteristics, but somehow you failed and instead wrote that being self-taught means that one cannot implement things as simple as binary search. And this is plain wrong, as is equating being self-taught with programming in high-level languages.
"Tables are being deprecated - everything should be done with divs" was the mantra at this place for months, and no amount of logic or argumentation would make anyone see sense. Even pointing out that the W3C spec explicitly still had TABLE as a defined element didn't help... that was just "false", because tables were "going away". This was... 8 years ago(?)... we still have tables. :/
Good times. :)
Not everyone works at Facebook, nor is everyone writing real-time device drivers, nor is everyone writing something that will blow up in users tomorrow, devastating the business if every customer isn't served in less than 50ms.
I may be one of those in the 'get it done' camp, but I've also learned over the past several years to opt for using well-known libraries when possible, to take advantage of the expertise and skills that I don't have (yet). 15 years ago, I was firmly in the 'write it from scratch and focus on performance, tightness, elegance, etc' camp, but not so much today. Part of that is because we simply didn't have the wealth of free software libraries we do today, so you had to write more stuff from scratch, but that's not as much of an excuse today.
I've yet to have to write a binary search after 17 years in professional software development, and my projects have not suffered because of it. I had probably a good 10 years of hobbyist time before that, so I probably have simply picked up 'good' patterns to common problems without necessarily knowing the specific CS theory or names behind them in some cases.
All that said, my answer to performance issues in my apps is generally not 'throw more hardware at it' - I will profile to look for bottlenecks, isolate specific areas and rewrite sections of code to make them more performant, which sometimes means changing how data is organized/stored, or modifying queries, or something else.
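The profile-then-rewrite workflow described above can be sketched with Python's standard cProfile module (a minimal sketch; the slow function is invented purely for illustration):

```python
import cProfile
import io
import pstats

def slow_concat(n):
    # A deliberately naive function to profile: builds a
    # string one piece at a time inside a loop.
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_concat(10_000)
profiler.disable()

# Print the top entries by cumulative time to spot the bottleneck.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```

Once the hot spot is isolated this way, the targeted rewrite (changing how data is stored, modifying queries, etc.) can be done with evidence rather than guesswork.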
Lastly, while I think I understand the type of person/attitude you're talking about "full of self-important people like this", I don't think I see myself in that particular camp (but of course, no one ever does, right?). There's a degree of pragmatism that needs to happen in 'real world' software dev (isn't agile all about "you ain't gonna need it"?). I've also had to 'clean up' after enough other developers over the past 17 years - including myself on a number of occasions - that my perspective may be sufficiently different from the type of people that work in company X for 10 years and rise to the rank of 'sr dev' only ever having worked in one or two places. Or maybe I'm just a self-important ass who's justifying himself too much in public?
EDIT: One other thing that has popped into my head - I've worked on more than a couple of projects where other people on the team (before me or concurrently) insisted on certain things being done "right", simply because that was the "right" way. Two things were apparent - they typically didn't know any other way at all (lack of experience) and they had no understanding of what the real use of the application was - never talked to end users or other depts/units, and were creating far more work for everyone else by not implementing things differently (in their minds, 'compromising on correctness'). A couple of different scenarios in the past few years spring to mind.
Despite my highly developed pragmatism, I still think understanding those theories at great depth makes me a better programmer.
It's almost subconscious, but knowing how the compiler works, how the processor runs the compiled code, and how everything works at the most basic level gives me a more complete idea of what I'm doing under the hood.
I guess you could say that it's like having a physics degree would make you a better car mechanic, when the reality is probably far from it. But then, I know lots of car mechanics who would benefit from knowing a little more about the theory of their work rather than just practical know-how. It's a fine balance.
Just to respond to your last point: I've been one of those people insisting on doing something the "right" way. I disagree that this is a bad thing, or a sign of inexperience or lack of understanding. Often when I have to argue hard for taking this "right" path, it's because of experience with doing it the other way, and a belief that one would only do it the "wrong" way because of inexperience. So in other words, I have exactly the opposite view on that entire subject.
Often the "right" way to do things takes a little more time, a little more work, but is usually more rock solid in the long term or easy to understand for other programmers. It's more deterministic and predictable, and usually more about good architecture than raw performance (or it should be).
In general you have good practical points that every programmer should be aware of. By no means should real programming be done in an academic and theoretical manner. But at the same time, theoretical CS is provably correct about a vast number of concepts (that is its business, after all), and by no means should they be ignored or looked down upon. The truth lies in a balance between practicality and theory here, and one should never take a side. Look at each situation with the perspective it demands and solve it using the wide variety of tools available to you, and be prepared to accept that a solution might be more theoretical or more practical than you're willing or able to understand.
I find that having a background in the lower-level aspects of computing does help a lot. The situation where I've seen people who know mostly software falter is when debugging problems that involve system interactions. My belief is that being able to construct a mental model of what's happening from top to bottom is crucial in solving some problems and people who can't do it eventually resort to just trying random things until something works.
I guess the take-away from this conversation is that it's useful to have knowledge in another related domain, be it computer hardware, low-level programming, physics, computer science, etc., even when you are developing only high-level applications.
I do not at all 'look down' on CS stuff, but also don't feel that I need to be an expert on every single aspect of it to get things done. For example, my Linux desktop worked pretty well for my needs both before and after Linux kernel scheduling patches. When I read about them, I could understand what was going on, and if I were deep in the kernel, might even have been able to identify what/where to do. I appreciate and benefit from proper and efficient algorithms in libraries I use, and can (if need be) identify a particular library that's suited for a particular project based on the algorithms it uses (and may even dip into the code to verify it's doing what I really need).
My own background is that I did 6502 machine code (by hand, no assembler) in the 80s (not professionally - I was a kid - did it as a hobbyist), so while I do not claim to be a computer scientist by any stretch, I have a decent conceptual idea of what's going on at the low level. Yes, most of that is out of date today, but I think conceptually I 'get' it. I also don't think it's actually helped that much in day-to-day work, but perhaps it has and I just don't see it any more, as it might be second nature(?)
Last point(?) - even if what you're doing is technically/algorithmically the most optimal solution to something, please document what you're doing. Even something as basic as identifying "hey, using a bloom filter here because XYZ needed ABC" in some comments will help people coming along who aren't familiar with particular patterns to get up to speed.
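For instance, that kind of comment might sit on top of something like this (a toy Bloom filter, purely illustrative and not from any real codebase):

```python
import hashlib

class BloomFilter:
    # Probabilistic set membership: no false negatives, occasional
    # false positives. A comment like "using a bloom filter here
    # because XYZ needed ABC" is what saves the next reader.
    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = 0

    def _positions(self, item):
        # Derive num_hashes deterministic bit positions per item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item):
        return all((self.bits >> pos) & 1 for pos in self._positions(item))

bf = BloomFilter()
bf.add("alice")
print("alice" in bf)  # -> True: added items are always found
```

Without the explanatory comment, a reader unfamiliar with the pattern sees only bit-twiddling; with it, they know what trade-off was being made and why.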
A graphical toolkit system that stored all the styles applied to components as a linked list, but each element included a pointer to the head of the linked list, and every time a new component was instantiated it was added to the head of the linked list. Which meant that creating a new graphical element went from O(1) to O(N) and caused significant slowdown (i.e. seconds of time).
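A minimal sketch of that design flaw (invented names, reconstructing the described behavior): because every node caches a pointer to the head, each prepend must walk the whole list to fix those cached pointers up.

```python
class StyleNode:
    def __init__(self, style):
        self.style = style
        self.next = None
        self.head = None  # the flaw: every node caches the list head

def prepend_style(head, style):
    node = StyleNode(style)
    node.next = head
    node.head = node
    # The hidden O(N) walk: every existing node's cached head
    # pointer must be rewritten to point at the new head.
    cur = head
    while cur is not None:
        cur.head = node
        cur = cur.next
    return node

head = None
for s in ["bold", "italic", "underline"]:
    head = prepend_style(head, s)
# Creating N elements therefore costs O(N^2) overall,
# even though prepending to a linked list "should" be O(1).
```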
A well-known open-source XML parsing library which stored the attributes of an XML element in a linked list; as part of its XML validation it had to ensure attribute uniqueness, and to do that it had to iterate through every attribute. This means that inserting N attributes would take O(N^2) - again, enough to cause a significant performance degradation.
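A sketch of the difference (hypothetical function names, not the library's actual code): the linked-list scan makes each insert O(N), while a hash set makes the uniqueness check O(1).

```python
def insert_attrs_scan(attrs):
    # Linked-list style: validate uniqueness by scanning every
    # already-stored attribute on each insert -> O(N^2) overall.
    stored = []
    for name, value in attrs:
        for existing, _ in stored:
            if existing == name:
                raise ValueError(f"duplicate attribute: {name}")
        stored.append((name, value))
    return stored

def insert_attrs_set(attrs):
    # Same validation backed by a hash set -> O(N) overall.
    stored, seen = [], set()
    for name, value in attrs:
        if name in seen:
            raise ValueError(f"duplicate attribute: {name}")
        seen.add(name)
        stored.append((name, value))
    return stored
```

Both versions accept and reject exactly the same inputs; only the asymptotics differ, which is why the bug survives correctness testing and only shows up as a performance degradation.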
Iteration through hashtables.
Writing software is complicated, and it's difficult to remember exactly how every part of a large system works. Sometimes people forget an important detail and end up writing code that performs badly. It's hardly limited to people who don't understand CS theory.
I think these kinds of examples are very anecdotal.
I think this is an artifact of many of them using high-level languages, where linked lists are rarely used.
That's no excuse, though.
Also, even if you never thought about efficiency or algorithms: if someone describes binary search to you and you cannot implement it, you are still learning to code. If someone asks you to write an efficient search algorithm, I can see those types of people missing it (especially when you consider the fact that, for the general case, you also need an O(n log n) sorting algorithm).
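For reference, the kind of thing being asked for, as a minimal Python sketch (note the precondition that motivates the O(n log n) sort mentioned above):

```python
def binary_search(items, target):
    # Precondition: items is sorted -- which is exactly why the
    # general case needs an O(n log n) sort first.
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1  # not found

print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3
```

Roughly a dozen lines, yet famously easy to get subtly wrong (off-by-one bounds, bad midpoint), which is why it works as a litmus test.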
If they actually refuse to go back and implement a better algorithm once the characteristics of the application are known, that is another matter. Do these people actually hold the same beliefs once they start to see the performance of their application rapidly degrade?
>> 1. Complexity from Aggregation
Don't conflate "self-taught" with ignorance. I'm 85% self-taught (I was a math major in college, and I took a few CS courses but not enough to become a serious coder) but I've picked up a lot of that stuff, on account of curiosity, later on. Whether someone learned how to program in school or in the trenches matters a lot less than whether they had the curiosity to actually learn it, rather than a "get 'er done" attitude that leads to no real knowledge.
There are a lot of people who pursued CS majors and evidently did well enough to get good jobs but, when they got out into the real world, turned out a bunch of VisitorFactory enterprise crap. Methinks they should have drunk less in college.
Functional programming isn't always the solution, but it's amazing how often it is the right way of doing things. I just wrote some neural network code, because I'm developing AI for a card game, and I used mutable state because back-propagation feels "inherently mutable" and because mutable Arrays are fast (whereas Scala's Vector might not be). I regret doing so. If nothing else, I should have started with the immutable solution and only moved to the mutable one if there was a measured performance benefit. So many bugs are invited in when you start using mutable state.
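A tiny Python illustration of the kind of bug mutable state invites (not the actual Scala code, just the shape of the problem):

```python
# Mutable: two "layers" alias the same weight list, so an
# in-place update to one silently corrupts the other.
shared = [0.5, 0.5]
layer_a = shared
layer_b = shared
layer_a[0] += 0.1
# layer_b[0] changed too, even though we never touched layer_b.

# Immutable: updates build new tuples, so sharing the old
# value between layers is always safe.
shared_t = (0.5, 0.5)
layer_a_t = (shared_t[0] + 0.1,) + shared_t[1:]
layer_b_t = shared_t  # still (0.5, 0.5), unaffected
```

Starting immutable rules out this whole class of aliasing bugs; the mutable optimization can be reintroduced later, behind a measured benchmark.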
My problem with Java-the-culture is that it seems to come from a hatred of mathematics and the reality of what programming actually is: solving computational problems, and using abstractions when they simplify solutions. Large-project methodologies with the weird design patterns seem tailored to make the programming process easier for people who failed Calc 2 to manage. To make it something that non-technical higher-ups can understand and commoditize: "We have 38 Java developers, 302 kLoC, 1973 Factories, and 714 Visitors, and we're going to double all these numbers in the next 6 months." (Never mind whether any actual problems are being solved.)
2) Software quality is inversely proportional to the number of people working on it.
3) Teaching someone with maths how to code is a lot easier than teaching a coder maths.
Surprisingly, I still deal with many developers who write procedural code with if/else statements nested four or more levels deep, even after years of being developers.
I'm not against functional languages/approaches. I just think that everything has its place, and that procedural code and nested if/then/elses are appropriate in many circumstances.
If you need more than 3 levels of indentation, you're screwed anyway,
and should fix your program.
As for being far from the worst offense: well, I try to set my goals a little higher than "not the worst".
As far as I know, the only worse offenses to readability are jumps or formatting your program like an IOCCC entry.
Do you actually disagree, or are you just going on? If the former, then please give an example of the right "circumstances."
Here is an idea for CSCI 344: Identifying When to Throw that Shit Out
"Things You Should Never Do, Part I"
Seems like Joel should retract that article.
5. People Interaction
Yup. People skills matter. Programming seems like a great job for asocial people. It's actually the shittiest job imaginable if you have weak people skills, because you get staffed with the crappy maintenance projects that have no upside. (Programming probably has the most variance of any job category; the best projects are a lot of fun, and the worst grind your mind to sawdust and produce nothing.) If no one likes you, no one trusts you, and you'll never get projects where your technical intelligence really matters.
I understand the hate that technical people have for "office politics" but the only approach that works is to adapt. To figure that shit out so you can laugh (silently) at the suckers who don't get it instead of being one of them.
4. Writing Matters
Yes, yes, yes. It really does. My personal opinion is that you're no better a programmer than you are a teacher. We're drowning in technical assets. Most assets are of abysmal quality because the people generating them never took the time to communicate what was being done, how and why, but too little code is not a problem for the software world. We have too much. Too much code and far too much unmanaged complexity.
People who are unwilling or incapable of teaching others how to use the technical assets they've created should not be programmers IMO. That's your job. To solve problems and to teach people how to use those solutions.
3. Software is Never Done
Further reading: http://michaelochurch.wordpress.com/2012/04/13/java-shop-pol...
Sadly, this is more a consequence of big-program enterprise methodologies. In small-program shops, software can be "done". It can actually be finished. The overarching project continues to expand, but this shouldn't be accomplished by shoving more kLoC into a working program and diverting it away from the original vision.
Large, ambitious projects should be structured as systems and given the respect that a system deserves (such as attention to fault tolerance and communication protocol). There's no excuse for most of these large single-program monoliths that ultimately (Conway's Law) become artifacts of parochial corporate politics rather than elegant, minimalistically simple solutions to mathematical and computational problems.
I'm a fan of the Unix philosophy and small-program methodology. Solve a problem. Do it well. Move on. Come back to it if you need to solve another problem. Not unless. I don't like the big-program methodology, under the guise of "object-oriented programming", that seems to have won in the contemporary Java-shop culture.
2. Few Clever Algorithms
Sad but true. The managers' job is to take cleverness out of our jobs, not because they're assholes or don't trust us (usually not the case) but because if our work relied on our cleverness or creativity, then they wouldn't be doing their jobs, which is to deliver business value reliably. We tend to like the high-expectancy, high-variance, fun and creative work, but there isn't much tolerance for this in the industrial world, which would rather produce the low-expectancy "sure thing". The problem is that Big Software actually isn't "reliable"; it's just that the legacy costs are paid out later after all the decision-makers have had a couple of promotions and are far away from the wreckage.
1. Complexity from Aggregation
Yes. This is exactly why I hate big-program methodologies. They inject so much unplanned, unexpected complexity into everything. People should aim for 500 lines of really clean, usable code instead of 20,000 lines of garbage that barely solves the problem and for which no one really knows what it does.
Great point. Or as Daniel Geer et al put it (discussing security, but applicable in general): "The central enemy of reliability is complexity. Complex systems tend to not be entirely understood by anyone. If no one can understand more than a fraction of a complex system, then, no one can predict all the ways that system could be compromised by an attacker. Prevention of insecure operating modes in complex systems is difficult to do well and impossible to do cheaply: The defender has to counter all possible attacks; the attacker only has to find one unblocked means of attack. As complexity grows, it becomes ever more natural to simply assert that a system or a product is secure as it becomes less and less possible to actually provide security in the face of complexity."
Managers clamp down on individual creativity and cleverness because they fear complexity, and rightly so, because 95% of software complexity serves no value, and only creates frustration and risk. Their risk-limiting optimization tends to suck all the fun and creativity out of the job, although that's not their intent.
Here's the problem: complexity emerges anyway (Greenspun's Tenth Rule). Force people to use Java instead of high-level languages and they'll invent AbstractFactory patterns and hideous, undocumented DSLs in the name of "object-oriented programming". The problem with software is the same problem that exists in legislation: laws are never unmade. The difference is that no one needs to know or care about horse-carriage requirements from 1730 in Philadelphia, but the legacy complexity in software lives on, making everything unreadable and messy.
Moreover, the end-result of all this unexpected complexity is that most software jobs become a legacy slog, which further reduces the room for creative expression.
This is why the best programmers are in startups -- I do not believe you can be good at people and computers at the same time. And it is much more profitable to be involved in people, sadly.
Could this also be one of the _biggest_ reasons why startups fail? A lack of people skills and a focus on technical problems for their own sake? Not to mention that a lack of communication about the intent of the code seems to be the biggest time sink on programmers' productivity.
Of course, on the flip side, you can always hire smarter and smarter developers than the previous guy, since you kind of need them to navigate the previous legacy code base. This seems to be the pattern in our industry.
Oh, come on. Show your work.
For that matter, define "best." 
 My definition includes people skills.
You can become adequate, and you should. Yes, the monkey-bots running million-year-old legacy code are a pain in the ass (it's annoying being one, too) but you can learn how to deal with them, and you must.
People skills matter more when you're in startups. The business side is all pitch, pitch, pitch, and from a disadvantaged position. If you're technical, you need the people skills to size up a business co-founder, or else you'll end up with a dud, and you need to be good enough with people that he trusts you and takes you seriously.