Top Surprises When Starting Out as a Software Developer (henrikwarne.com)
116 points by henrik_w on Aug 26, 2012 | 76 comments



My biggest surprise - many developers, even in senior positions, are self-taught, cannot properly implement even basic stuff like binary search, and don't understand basic performance considerations around algorithmic complexity. Their mindset is "we are using high-level languages; specific algorithms and performance considerations are properly addressed for us by the language designers, we are doing real stuff and not some theoretical CS BS".

This varies by company, but many companies are full of self-important people like this.


Your implication that self-taught developers are unable to comprehend basic parts of computer science is really an over-generalization. I and many of my co-workers are self-taught, and we often find that graduates, whilst intelligent and comfortable with the more esoteric areas of the profession, are unable to grasp ideas such as 'time is money, don't do it the cool way when a simpler way will work just as well' and are also incapable of understanding how to monetize aspects of a code base.

It's all well and good being self-taught or university-educated, but inevitably you need a mix of both disciplines to be able to function properly within a company.


Yes, it's exactly that attitude he is arguing against. Then it falls to people who know what they're doing to clean up after the "just get it done" morons, when it turns out just getting it done means getting it done wrong, badly, and slowly. Management thinks "oh, it's already done, speeding it up should be an easy job, what's wrong with our new maintenance guys that they take so long to just fix it up?"

I use the term morons advisedly; it is blunt and true, an accurate description of the "get it done" mentality of people with little experience and no formal training, who don't even know or acknowledge the depth of their ignorance and error.


I didn't read it that way. I also find it ironic that in arguing against self-important people, there is a hell of a lot of self-importance inherent in the post (and yours, too).

The degree (or lack thereof) isn't the issue. Having a CS degree doesn't make you immune to any of your assertions. What you're describing is just a bad programmer.

I believe the GP's argument is about programmers with CS degrees who don't 'get' simplicity, because they can't validate their own intelligence and qualifications with it. I would never tar all programmers with the same brush, but much like the ignorant 'get it done' guy, there's the ignorant 'flex my brain muscles, fuck everyone else' guy too.


Of course I am describing a bad programmer. Your point is well taken: many programmers with CS degrees are terrible, many programmers without CS degrees are terrible, and there are good ones among both.

There are of course many ways to be a bad programmer. "Look how smart I am" is one kind, but I don't worry nearly as much about them because I have found that far more often I am dealing with the "get it done" kind of bad programmer.


It'd be interesting to trace at what point the "look how smart I am" developer turns into a "get it done" developer, versus drops out of the field.

Or to track how many people might self-identify as a "get it done" developer who might be labelled as a "look how smart I am" developer (and vice versa).


I'm not saying to just 'get it done', because that is really misinterpreting what I said before. I'm saying that the low-level, time-intensive method is not always the best, especially when you need to push a project soon and time is really running out.

Your insinuation that doing something quickly means doing it badly is, in my opinion, unfounded; being productive and speedy is merely a by-product of knowing what you are doing.


Fair enough. "Get it done" is a trigger phrase for me, so I responded in too personal an attack. There are many "get it done" morons out there, and I have worked with several, but that of course does not mean you are one. You recognize the inherent trade-offs involved, which I appreciate.


Surprise, both of you guys are needed! The "get it done" guy is almost never the "do it right" guy. Some are closer to each other than others... but almost never are they the same guy. The thing is, when you're building that first prototype, the most important thing is getting it out there and proving the concept works! It doesn't matter if you build the perfect architecture that can scale effortlessly if in the end no one decides your product has value.

Of course, I've been that guy who has to clean up too. That job sucks :( almost never is the product designed the way it eventually ends up being used.


I have also been on the side of finishing a new product quickly and imperfectly, and I acknowledge the trade-offs involved. I rail against the "get it done" mind-set in quotes. I find that a programmer who prides themselves on getting it done is, more often than not, a "get it done" moron who means get it done badly. The kind of programmer who thinks "getting it done" means stabbing the code enough times that the bloody wounds congeal just enough to compile. They are not smart or experienced enough to know what they don't know.


I subscribe to "get it done" programming. I've never had the troubles you seem to speak of going back to fix the shortcuts though. Interfaces are, for all intents and purposes, black boxes, so your improvements can always be dropped right in with relatively little effort.

I guess if the codebase is a mess of spaghetti, that is another matter. I do always try to get the interfaces right from the beginning.


I am not implying in any way that self-taught developers are incapable of anything. I know a lot of great self-taught developers. However, many developers without formal CS training learn one or two technologies (e.g. Ruby or a particular web framework) and believe this is enough; they even call themselves "hackers". That last approach surprised me.


I think the same applies to everyone: you have to expand your horizons, unless the company you work for is extremely locked down on technologies.

Everyone should know a few languages so they have the tools they need to do their job properly. Would we use a chainsaw to screw something to a wall just because we hadn't learned how to use a screwdriver yet?


Self-taught programmer in a senior position here.

Frankly, pointing to being able to implement algorithms as a measure of a programmer's ability is next to useless. Any code monkey can ape a data structure they've seen in a book. Designing code so that when requirements change you can easily swap that data structure for another is the real test.

PS: I can implement any number of algorithms and design patterns, though I rarely get to.
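
To make that concrete, here's a minimal sketch (hypothetical names, Python only for illustration, nothing from a real codebase) of the kind of design I mean: hide the structure behind a small interface so callers never depend on it directly, and swapping it later is a one-line change.

    from abc import ABC, abstractmethod

    class MembershipStore(ABC):
        """Hypothetical interface: callers only ever see add/contains."""

        @abstractmethod
        def add(self, item): ...

        @abstractmethod
        def contains(self, item) -> bool: ...

    class ListStore(MembershipStore):
        """First, simplest implementation: a plain list (O(n) lookups)."""
        def __init__(self):
            self._items = []
        def add(self, item):
            self._items.append(item)
        def contains(self, item) -> bool:
            return item in self._items

    class SetStore(MembershipStore):
        """Drop-in replacement when requirements change: O(1) average lookups."""
        def __init__(self):
            self._items = set()
        def add(self, item):
            self._items.add(item)
        def contains(self, item) -> bool:
            return item in self._items

    def already_registered(store: MembershipStore, user_id: str) -> bool:
        # Caller code is unchanged no matter which store is plugged in.
        return store.contains(user_id)

Swapping ListStore for SetStore is then a change at the construction site only, which is the kind of flexibility I care about far more than being able to recite the structures themselves.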


I posted a much longer reply just now, but you get to the heart of it.

On your PS of "I rarely get to", would you go further and say "I rarely need to" as well? Because that's how I feel for about 95% of the work I do. There's a few places in larger projects where I enjoy getting to test out a few different approaches to data structures in anticipation of potential use cases (speed, concurrency, reporting needs, etc). But after a while, I tend to know in advance what will work 'well enough' (or maybe even what the 'optimal' structures are) and don't need to (or get to) spend much time experimenting.


I cringe in pain every time I read such posts. I'm almost entirely self-taught, but I have been teaching myself for nearly twenty years now and am still learning, even faster now than ever before. And I will learn for the rest of my life. Hell, I'm writing a compiler now in my free time for fun (not that it's particularly difficult), and I have already covered almost everything an average CS curriculum covers. Please stop assuming that being self-taught is in any way worse than being taught by someone else. Maybe it took me more time to comprehend certain concepts, but my understanding of them is just as good as, if not better than, that of most graduates.

I know that what you really wanted to say is that people who lack motivation, passion and curiosity are worse programmers than those exhibiting these characteristics, but somehow you failed and instead wrote that being self-taught means that one cannot implement things as simple as binary search. And this is plain wrong, as is equating being self-taught with programming in high-level languages.


For me, it was the opposite of ignoring performance: the developers of the platform we work on have decided to use floats for accounting, because decimals were "slow".
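
For anyone who hasn't been bitten by this yet, here's a tiny illustration (in Python, purely to show the rounding behaviour; the platform in question was something else) of why binary floats are a poor fit for money:

    from decimal import Decimal

    # Summing ten cents a thousand times with binary floats drifts off the exact value.
    float_total = sum(0.10 for _ in range(1000))
    print(float_total)            # prints something like 99.9999999999986, not exactly 100.0
    print(float_total == 100.0)   # False

    # Decimals represent the amounts exactly, at some cost in speed.
    decimal_total = sum(Decimal("0.10") for _ in range(1000))
    print(decimal_total)                        # 100.00
    print(decimal_total == Decimal("100.00"))   # True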


Relatedly (yet unrelatedly), I worked at a place where the IT team were given a bunch of Excel docs to post to the website. They spent weeks taking all the data and creating nested div/span sets with hand styling on each cell (because certain IE versions wouldn't deal with the styles right unless they were inline, IIRC), because tables were "depreciated" (sic). The end result looked just like HTML tables, except it was about 5 times as large in HTML, and nowhere near as manageable or as usable by screen readers.

"tables are being depreciated - everything should be done with divs" was some mantra at this place for months, and there was no amount of logic or argumentation that would make anyone see sense. Even something like pointing out the W3C spec explicitly still had TABLE as a defined element... that was just false, because tables are "going away". This was... 8 years ago(?)... we still have tables. :/

Good times. :)


Yep, it is tables used for layout that are deprecated.


Well, I'd say not even that. When you're laying out a spreadsheet, just use tables then style them as appropriate. My recommendations were summarily passed over, and I don't work there any more. :)


I'd rather only have to deal with these 'high level only' folks than with the ones who literally could not code up a FizzBuzz solution to save themselves. My biggest surprise is that people like this really do exist in the industry.


I fall into that camp. I don't look down on 'theoretical CS' stuff, but it's rarely ever even had to be a consideration in the projects I've worked on, which have included ecommerce systems selling billions of dollars of stuff (large quantities, small price per item), real-time reporting of financial data, and numerous other projects requiring a degree of scale or speed or both (PHP, VB, Java and other stuff over the years).

Not everyone works on Facebook, nor is everyone writing real-time device drivers, nor is everyone writing something that will explode in users tomorrow, devastating the business if not every customer is served in less than 50ms.

I may be one of those in the 'get it done' camp, but I've also learned over the past several years to opt for using well-known libraries when possible, to take advantage of the expertise and skills that I don't have (yet). 15 years ago, I was firmly in the 'write it from scratch and focus on performance, tightness, elegance, etc' camp, but not so much today. Part of that is because we simply didn't have the wealth of free software libraries we do today, so you had to write more stuff from scratch, but that's not as much of an excuse today.

I've yet to have to write a binary search after 17 years in professional software development, and my projects have not suffered because of it. I had probably a good 10 years of hobbyist time before that, so I probably have simply picked up 'good' patterns to common problems without necessarily knowing the specific CS theory or names behind them in some cases.

All that said, my answer to performance issues in my apps is generally not 'throw more hardware at it' - I will profile to look for bottlenecks, isolate specific areas and rewrite sections of code to make them more performant, which sometimes means changing how data is organized/stored, or modifying queries, or something else.
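
For what it's worth, that workflow is nothing fancy; a minimal sketch using Python's built-in profiler (report_slow here is a made-up stand-in for whatever code path is actually slow):

    import cProfile
    import pstats

    def report_slow():
        # Hypothetical stand-in for the slow code path being investigated.
        return sorted(str(i) for i in range(200000))

    # Profile the suspect call and keep the stats in memory.
    profiler = cProfile.Profile()
    profiler.enable()
    report_slow()
    profiler.disable()

    # Print the few functions where most of the time actually goes, then decide
    # whether to reorganize data, change queries, or rewrite that section.
    stats = pstats.Stats(profiler)
    stats.sort_stats("cumulative").print_stats(10)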

Lastly, while I think I understand the type of person/attitude you're talking about "full of self-important people like this", I don't think I see myself in that particular camp (but of course, no one ever does, right?). There's a degree of pragmatism that needs to happen in 'real world' software dev (isn't agile all about "you ain't gonna need it"?). I've also had to 'clean up' after enough other developers over the past 17 years - including myself on a number of occasions - that my perspective may be sufficiently different from the type of people that work in company X for 10 years and rise to the rank of 'sr dev' only ever having worked in one or two places. Or maybe I'm just a self-important ass who's justifying himself too much in public?

EDIT: One other thing that has popped into my head - I've worked on more than a couple of projects where other people on the team (before me or concurrently) insisted on certain things being done "right", simply because that was the "right" way. Two things were apparent - they typically didn't know any other way at all (lack of experience), and they had no understanding of what the real use of the application was - they never talked to end users or other depts/units, and were creating far more work for everyone else by not implementing things differently (in their minds, 'compromising on correctness'). A couple of different scenarios from the past few years spring to mind.


Despite being someone with a formal CS background, I completely understand where you're coming from. I've worked on practical applications for the last 5 years with very little consideration for the details of algorithms or the underlying theories behind what I'm doing.

Despite my highly developed pragmatism, I still think understanding those theories at great depth makes me a better programmer.

It's almost subconscious, but knowing how the compiler works, how the processor runs the compiled code, and how everything works at the most basic level gives me a more complete idea of what I'm doing under the hood.

I guess you could say that it's like having a physics degree would make you a better car mechanic, when the reality is probably far from it. But then, I know lots of car mechanics who would benefit from knowing a little more about the theory of their work rather than just practical know-how. It's a fine balance.

Just to respond to your last point: I've been one of those people insisting on doing something the "right" way. I disagree that this is a bad thing, or a sign of inexperience or lack of understanding. Often when I have to argue hard for taking this "right" path, it's because of experience with doing it the other way, and a belief that one would only do it the "wrong" way because of inexperience. So in other words, I have exactly the opposite view on that entire subject.

Often the "right" way to do things takes a little more time, a little more work, but is usually more rock solid in the long term or easy to understand for other programmers. It's more deterministic and predictable, and usually more about good architecture than raw performance (or it should be).

In general you have good practical points that every programmer should be aware of. By no means should real programming be done in an academic and theoretical manner. But at the same time, theoretical CS is provably correct about a vast number of concepts (that is its business, after all), and by no means should they be ignored or looked down upon. The truth lies in a balance between practicality and theory here, and one should never take a side. Look at each situation with the perspective it demands and solve it using the wide variety of tools available to you, and be prepared to accept that a solution might be more theoretical or more practical than you're willing or able to understand.


I don't come from a CS background but rather my education was in computer engineering, mostly of the hardware and low-level systems programming variety. The majority of what I actually do for work is entirely self-taught.

I find that having a background in the lower-level aspects of computing does help a lot. The situation where I've seen people who know mostly software falter is when debugging problems that involve system interactions. My belief is that being able to construct a mental model of what's happening from top to bottom is crucial in solving some problems and people who can't do it eventually resort to just trying random things until something works.

I guess the take-away from this conversation is that it's useful to have knowledge in another related domain, be it computer hardware, low-level programming, physics, computer science, etc., even when you are developing only high-level applications.


I struggle between wanting to clarify my 'example' situations more and not wanting to 'name names' (or implicate people based on identifiable information). In two recent cases, scenarios that were being argued for as "right" (as in "there is only one right way, because I have a CS degree") were in actuality subpar options where there was no definable 'right way' (think arguments over which order invoicing/accounting software should process debits/credits in). To some extent there are 'right' ways in some fields, but the real right answer is to work with the business units in question, determine what they need now and how to move them to another process if there's a compelling reason to change, and document all of it for posterity. In contrast, I was dealing with something built in isolation, undocumented, and justified with hand-wavy CS pseudo-BS.

This isn't an argument against CS theory at all, but as much as people have run into justifications for bad code with "well, it worked and we had a deadline", I've also run into quite a number of piss-poor designs that didn't really work (or required massive workarounds) because someone with a CS degree was given free rein without challenge, and never understood the pragmatic aspects of the business needs they were purporting to serve. In other words, I suspect we would probably have the exact same view on that subject (if I'd shared explicit details).

I do not at all 'look down' on CS stuff, but I also don't feel that I need to be an expert on every single aspect of it to get things done. For example, my Linux desktop worked pretty well for my needs both before and after the kernel scheduling patches. When I read about them, I could understand what was going on, and if I were deep in the kernel, I might even have been able to identify what to do and where. I appreciate and benefit from proper and efficient algorithms in the libraries I use, and can (if need be) identify a particular library that's suited to a particular project based on the algorithms it uses (and may even dip into the code to verify it's doing what I really need).

My own background is that I did 6502 machine code (by hand, no assembler) in the 80s (not professionally - I was a kid - I did it as a hobbyist), so while I do not claim to be a computer scientist by any stretch, I have a decent conceptual idea of what's going on at the low level. Yes, most of that is out of date today, but I think conceptually I 'get' it. I also don't think it's actually helped that much in day-to-day work, but perhaps it has and I just don't see it any more because it's become second nature(?)

Last point(?) - even if what you're doing is technically/algorithmically the most optimal solution to something, please document what you're doing. Even something as basic as identifying "hey, using a bloom filter here because XYZ needed ABC" in some comments will help people coming along who aren't familiar with particular patterns to get up to speed.


Some real world examples I've seen where people have screwed up because of not understanding theory:

A graphical toolkit system that stored all the styles applied to components as a linked list, but each element included a pointer to the head of the linked list, and every time a new component was instantiated it was added to the head of the linked list - which meant that creating a new graphical element went from O(1) to O(N) and caused significant slowdown (i.e. seconds of time).

A well-known open-source XML parsing library which stored the attributes of an XML element in a linked list; as part of its XML validation it had to ensure attribute uniqueness, and to do that it had to iterate through every existing attribute. This meant that inserting N attributes took O(N^2) time - again enough to cause significant performance degradation (a sketch of this pattern follows below).

Iteration through hashtables.
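
To make the XML example concrete, here's a hedged sketch (not the actual library's code, just the shape of the bug): validating uniqueness by rescanning a list makes N insertions quadratic, while an auxiliary set keeps them linear.

    def add_attributes_quadratic(attrs):
        """Mimics the linked-list approach: each insert rescans all previous names, O(N^2) overall."""
        stored = []
        for name, value in attrs:
            for existing_name, _ in stored:          # O(N) scan on every insert
                if existing_name == name:
                    raise ValueError(f"duplicate attribute: {name}")
            stored.append((name, value))
        return stored

    def add_attributes_linear(attrs):
        """Same validation with a set of seen names: O(1) membership check per insert, O(N) overall."""
        stored, seen = [], set()
        for name, value in attrs:
            if name in seen:
                raise ValueError(f"duplicate attribute: {name}")
            seen.add(name)
            stored.append((name, value))
        return stored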


I've seen cases where people with degrees in CS have screwed up in equally bad or even worse ways.

Writing software is complicated, and it's difficult to remember exactly how every part of a large system works. Sometimes people forget an important detail and end up writing code that performs badly. It's hardly limited to people who don't understand CS theory.


As a self-taught developer, I know what Big O notation is, and I've worked with many CS grads (including from Stanford) who did not take algorithmic efficiency into consideration.

I think these kinds of examples are very anecdotal.


The bigger point is that it's almost all anecdotal - the number of people in this field is too broad and varied to be able to draw any substantial conclusions about anything, imo.


Which graphical toolkit system and xml parsing library are you referring to?


That can also be attributed to not caring. Do CS degrees teach compassion?


Was that because the author had no comprehension of complexity and couldn't understand what was wrong with the design, or because of a lazy oversight that was fixable once the problem was brought to light?


I suspect the "self-important people like this" that trekkin is talking about are those who tell IT to "copy every program (and call) in the codebase, add 'Acme' to the front of all the names, and overwrite a clone of the present general data with Acme's data. Why can't this simple task be done by the end of the week, so I can 'deliver a business solution' to Acme next week?". This is just an extreme case I've seen of how the legacy code in many corporate IT depts consists of thousands of copies of the same basic code patterns.


Yep. I've been out of school since 1991 and the last time I wrote a binary search, or balanced a tree, or even implemented a linked list, was in school.


For christ's sake. I've met far too many people with Uni degrees who don't know what a B-Tree is, or even what a linked list is (truly scary).


Yeah, but they could rewrite std::stack... um, without templates.


If they did indeed do that without understanding how a linked list works, then they're not going to do very well at it.


Surprisingly, most software devs I've met understand arrays and hashes, but few know what a linked list is.

I think this is an artifact of many of them using high-level languages where linked lists are rarely used.

That's no excuse, though.


You are the one suffering with self-importance by virtue of your disregard for the opinions of experienced engineers.


This is actually surprising to me. I started out as a self-taught programmer in middle school. Very early on I ran into performance problems and had to spend a fair amount of time making code efficient. I did, and still do, only explicitly try to make code efficient after I see it is a performance issue, but I wonder how anyone can do a large amount of programming without running into such problems.

Also, even if you never thought about efficiency or algorithms: if someone describes binary search to you and you cannot implement it, you are still learning to code. If someone asks you to write an efficient search algorithm, I can see those types of people missing it (especially when you consider that for the general case you also need an O(n log n) sorting algorithm).
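
For reference, the kind of thing being asked about is only a handful of lines; a minimal iterative sketch over a sorted list:

    def binary_search(sorted_items, target):
        """Return the index of target in sorted_items, or -1 if absent.

        Assumes sorted_items is in ascending order; halves the search
        range each step, so it runs in O(log n).
        """
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1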


I understand where your coworkers are coming from. Getting the application up and running to prove the concepts the application intends to solve is critical in a business setting. If you can implement search by hitting every element in the array in a fraction of the time it would take to implement a binary search, you are that much further along to seeing the application in action.

If they actually refuse to go back and implement a better algorithm once the characteristics of the application are known, that is another matter. Do these people actually hold the same beliefs once they start to see the performance of their application rapidly degrade?


>> 2. Few Clever Algorithms

>> 1. Complexity from Aggregation


> My biggest surprise - many developers, even in senior positions, are self-taught and cannot properly implement even basic stuff like binary search, don't understand basic performance considerations around algorithm complexity.

Don't conflate "self-taught" with ignorance. I'm 85% self-taught (I was a math major in college, and I took a few CS courses but not enough to become a serious coder) but I've picked up a lot of that stuff, on account of curiosity, later on. Whether someone learned how to program in school or in the trenches matters a lot less than whether they had the curiosity to actually learn it, rather than a "get 'er done" attitude that leads to no real knowledge.

There are a lot of people who pursued CS majors and evidently did well enough to get good jobs but, when they got out into the real world, turned out a bunch of VisitorFactory enterprise crap. Methinks they should have drunk less in college.


You have to be careful: many people who laugh at design patterns turn out crap spaghetti code. With design patterns, you are at least attempting to separate traversal from operations on the elements, as with visitors. People who don't like them often turn out huge monolithic algorithms, which is an even worse crime.


Functional programming has "design patterns" but few of them, and those design patterns make so much sense that you stop thinking of them as such and just think of them as the way to solve problems. Referentially transparent functions and immutable records for data are design patterns in the non-pejorative sense, but they simplify code rather than complicating it. These are the two design patterns of functional programming: Noun and Verb. For mutable state (monad vs. ref cells vs. message passing) and for the Adjective problem (type classes vs. inheritance vs. functors) there is less of a consensus.

Functional programming isn't always the solution, but it's amazing how often it is the right way of doing things. I just wrote some neural network code, because I'm developing AI for a card game, and I used mutable state because back-propagation feels "inherently mutable" and because mutable Arrays are fast (whereas Scala's Vector might not be). I regret doing so. If nothing else, I should have started with the immutable solution and only moved to the mutable one if there was a measured performance benefit. So many bugs are invited in when you start using mutable state.
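
To illustrate the "start immutable, only go mutable after measuring" idea (sketched in Python rather than the Scala I actually used, with made-up names):

    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class Neuron:
        weights: tuple  # immutable record: an update returns a new value

    def apply_update(neuron: Neuron, deltas: tuple, learning_rate: float) -> Neuron:
        # Pure function: no hidden state changes, easy to test and reason about.
        new_weights = tuple(w - learning_rate * d for w, d in zip(neuron.weights, deltas))
        return replace(neuron, weights=new_weights)

    # Each training step produces a new Neuron; the old one is untouched,
    # so a bug can't silently corrupt state that another step depends on.
    n0 = Neuron(weights=(0.5, -0.2, 0.1))
    n1 = apply_update(n0, deltas=(0.1, 0.0, -0.3), learning_rate=0.01)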

My problem with Java-the-culture is that it seems to come from a hatred of mathematics and the reality of what programming actually is: solving computational problems, and using abstractions when they simplify solutions. Large-project methodologies with the weird design patterns seem tailored to make the programming process easier for people who failed Calc 2 to manage. To make it something that non-technical higher-ups can understand and commoditize: "We have 38 Java developers, 302 kLoC, 1973 Factories, and 714 Visitors, and we're going to double all these numbers in the next 6 months." (Never mind whether any actual problems are being solved.)


I didn't even mention functional programming. Functional Programming is not always an option.


1) The aggregation of marginal gains - most enterprise software is crap, most enterprise teams are lazy - if you consistently and repeatedly improve the worst aspects of your app you quickly end up with something that outperforms competitor software developed with 8-figure budgets.

2) Software quality is inversely proportional to the number of people working on it.

3) Teaching someone with maths how to code is a lot easier than teaching a coder maths.


About #3 - source?


I think it should go without saying, but there is one more point I'd add to this list: Learning never stops - don't be ignorant.

Surprisingly, I still deal with many developers who write procedural code with if/else statements nested 4 or more levels deep, even after years of being developers.


So you suggest that instead of several if/then/elses a clever recursive function call should be used, right? And the function should take its arguments by value, and have zero side-effects?


Why would you need a recursive function? It's just a matter of decomposing your functions into smaller parts.
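
A minimal sketch of what that decomposition looks like (a hypothetical order-processing example): guard clauses instead of four or more levels of nesting.

    from dataclasses import dataclass, field

    @dataclass
    class Customer:
        is_active: bool
        credit_limit: float

    @dataclass
    class Order:
        customer: Customer
        items: list = field(default_factory=list)
        total: float = 0.0

    # Deeply nested version: every new condition adds another indentation level.
    def process_order_nested(order):
        if order is not None:
            if order.items:
                if order.customer.is_active:
                    if order.total <= order.customer.credit_limit:
                        return "shipped"
        return None

    # Flattened version: each precondition becomes a guard clause, so the
    # happy path reads top to bottom at a single level of indentation.
    def process_order_flat(order):
        if order is None:
            return None
        if not order.items:
            return None
        if not order.customer.is_active:
            return None
        if order.total > order.customer.credit_limit:
            return None
        return "shipped"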


I think trekkin was being snarky about people preaching Functional techniques. (I don't think said snark was warranted, for the record.)


:)

I'm not against functional languages/approaches. I just think that everything has its place, and that procedural code and nested if/then/elses are appropriate in many circumstances.


When exactly are four levels or more of nesting appropriate? This has nothing to do with functional programming, it's a matter of readability. I mean, Linux is a completely procedural codebase (with some manual OO), but even its style guide says

    If you need more than 3 levels of indentation, you're screwed anyway,
    and should fix your program.


Why four, or three? Why not one or five? What a silly question/statement. It's completely subjective, and as far as "a matter of readability" goes, it's far, far from the worst offense.


It's not completely subjective. McConnell in Code Complete mentions studies by Chomsky and others that suggest that few people can understand more than three levels of nested ifs.

As for being far from the worst offense, well, I try to set my goals a little higher than "not the worst".


No, it's not that far from the worst offense. The hardest code I have ever had to read has always been code with ten+ levels of nesting in a big function, where I had to keep a big mental stack of what was necessarily true to be in the branch I was looking at. There is no situation on Earth where four levels of nesting couldn't be broken up into something more understandable; the only reason not to might be for the purposes of microoptimization.

As far as I know, the only worse offenses to readability are jumps or formatting your program like an IOCCC entry.

Do you actually disagree, or are you just going on? If the former, then please give an example of the right "circumstances."


I would suggest you see some of Uncle Bob's talks/videos (starting with http://www.cleancoders.com/)


My biggest surprise - The value of modularity is almost always ignored and that ivy-league CS grads can be horrible programmers.


Maybe not a surprise, but definitely unexpected, was the number of systems where, over time, the incremental features had overtaken the original architecture, but the vendor would not fund a new implementation. I've seen 15-year-old systems that were so byzantine that every feature change broke two others, and attempts at improvement only increased the entropy. It was both a huge time sink and a customer satisfaction debacle.

Here is an idea for CSCI 344: Identifying When to Throw that Shit Out


As the link by lttlrck also advocates: throwing shit out can easily be a mistake. More usually, http://www.amazon.com/Working-Effectively-Legacy-Michael-Fea... + http://www.amazon.com/Refactoring-Improving-Design-Existing-... can get you further, faster. Stuff keeps working while you incrementally improve it.


Maybe they read this:

"Things You Should Never Do, Part I"

http://www.joelonsoftware.com/articles/fog0000000069.html


Another favorite example for me BTW is the MS OS/2 2.0 fiasco that IMO is much worse, involving MS using unethical tactics to attack OS/2 later on. And Joel happens to be a former MS employee.


Ah, but doing that was what allowed them to kill IE and restart the browser wars.

Seems like Joel should retract that article.


Nowadays it's much more difficult to be surprised, at least if you're interested in more than just getting grades (which actually rules out plenty of my colleagues). There's so much information online, plus the opportunity to work on OSS projects, etc, that none of that really surprised me when I got my first job a few months ago. Reading TDWTF alone prepares you for the worst ;)


My biggest surprise (I started in the games industry) was just how low the bar for entry really was. I could have started 10 years earlier if I had realised just how truly awful most programmers in the industry are...


My top surprise? It's not uncommon to see people using MySQL with ISAM tables as a serious data store, and then half-implementing ACID properties in the application's GUI layer.

("Why would you want to put data constraints in the database? The data has already been validated by client side javascript, making the database enforce foreign keys too would just slow it down...")


There's a common theme in these posts, along the lines of "OMG I didn't realize paid programming gigs were effectively business roles". Let me address what's intrinsic and what's not.

5. People Interaction

Yup. People skills matter. Programming seems like a great job for asocial people. It's actually the shittiest job imaginable if you have weak people skills, because you get staffed with the crappy maintenance projects that have no upside. (Programming probably has the most variance of any job category; the best projects are a lot of fun, and the worst grind your mind to sawdust and produce nothing.) If no one likes you, no one trusts you, and you'll never get projects where your technical intelligence really matters.

I understand the hate that technical people have for "office politics" but the only approach that works is to adapt. To figure that shit out so you can laugh (silently) at the suckers who don't get it instead of being one of them.

4. Writing Matters

Yes, yes, yes. It really does. My personal opinion is that you're no better a programmer than you are a teacher. We're drowning in technical assets. Most assets are of abysmal quality because the people generating them never took the time to communicate what was being done, how and why, but too little code is not a problem for the software world. We have too much. Too much code and far too much unmanaged complexity.

People who are unwilling or unable to teach others how to use the technical assets they've created should not be programmers, IMO. That's your job: to solve problems and to teach people how to use those solutions.

3. Software is Never Done

Further reading: http://michaelochurch.wordpress.com/2012/04/13/java-shop-pol...

Sadly, this is more a consequence of big-program enterprise methodologies. In small-program shops, software can be "done". It can actually be finished. The overarching project continues to expand, but this shouldn't be accomplished by shoving more kLoC into a working program and diverting it away from the original vision.

Large, ambitious projects should be structured as systems and given the respect that a system deserves (such as attention to fault tolerance and communication protocol). There's no excuse for most of these large single-program monoliths that ultimately (Conway's Law) become artifacts of parochial corporate politics rather than elegant, minimalistically simple solutions to mathematical and computational problems.

I'm a fan of the Unix philosophy and small-program methodology. Solve a problem. Do it well. Move on. Come back to it if you need to solve another problem. Not unless. I don't like the big-program methodology, under the guise of "object-oriented programming", that seems to have won in the contemporary Java-shop culture.

2. Few Clever Algorithms

Sad but true. The managers' job is to take cleverness out of our jobs, not because they're assholes or don't trust us (usually not the case) but because if our work relied on our cleverness or creativity, then they wouldn't be doing their jobs, which is to deliver business value reliably. We tend to like the high-expectancy, high-variance, fun and creative work, but there isn't much tolerance for this in the industrial world, which would rather produce the low-expectancy "sure thing". The problem is that Big Software actually isn't "reliable"; it's just that the legacy costs are paid out later after all the decision-makers have had a couple of promotions and are far away from the wreckage.

1. Complexity from Aggregation

Yes. This is exactly why I hate big-program methodologies. They inject so much unplanned, unexpected complexity into everything. People should aim for 500 lines of really clean, usable code instead of 20,000 lines of garbage that barely solves the problem and for which no one really knows what it does.


> The managers' job is to take cleverness out of our jobs, not because they're assholes or don't trust us (usually not the case) but because if our work relied on our cleverness or creativity, then they wouldn't be doing their jobs, which is to deliver business value reliably.

Great point. Or as Daniel Geer et al put it[1] (discussing security, but applicable in general): "The central enemy of reliability is complexity. Complex systems tend to not be entirely understood by anyone. If no one can understand more than a fraction of a complex system, then, no one can predict all the ways that system could be compromised by an attacker. Prevention of insecure operating modes in complex systems is difficult to do well and impossible to do cheaply: The defender has to counter all possible attacks; the attacker only has to find one unblocked means of attack. As complexity grows, it becomes ever more natural to simply assert that a system or a product is secure as it becomes less and less possible to actually provide security in the face of complexity."

[1] http://cryptome.org/cyberinsecurity.htm


Fair point, and I agree, but complexity and creativity are orthogonal.

Managers clamp down on individual creativity and cleverness because they fear complexity, and rightly so, because 95% of software complexity serves no value, and only creates frustration and risk. Their risk-limiting optimization tends to suck all the fun and creativity out of the job, although that's not their intent.

Here's the problem: complexity emerges anyway (Greenspun's Tenth Rule). Force people to use Java instead of high-level languages and they'll invent AbstractFactory patterns and hideous, undocumented DSLs in the name of "object-oriented programming". The problem with software is the same problem that exists in legislation: laws are never unmade. The difference is that no one needs to know or care about horse-carriage requirements from 1730 in Philadelphia, but the legacy complexity in software lives on, making everything unreadable and messy.

Moreover, the end-result of all this unexpected complexity is that most software jobs become a legacy slog, which further reduces the room for creative expression.


An excellent comment. Your point number 2, by the way, applies very well to most legal work done by large law firms. Been there, done that. It never hurts to remind people in early career as they leave formal higher education and go into the for-profit, private-enterprise labor force that domain-specific technical skill as such is just a small part of the skill set someone needs to build a successful career.


>Yup. People skills matter. Programming seems like a great job for asocial people. It's actually the shittiest job imaginable if you have weak people skills, because you get staffed with the crappy maintenance projects that have no upside. (Programming probably has the most variance of any job category; the best projects are a lot of fun, and the worst grind your mind to sawdust and produce nothing.) If no one likes you, no one trusts you, and you'll never get projects where your technical intelligence really matters. I understand the hate that technical people have for "office politics" but the only approach that works is to adapt. To figure that shit out so you can laugh (silently) at the suckers who don't get it instead of being one of them.

This is why the best programmers are in startups -- I do not believe you can be good at people and computers at the same time. And it is much more profitable to be involved in people, sadly.


> This is why the best programmers are in startups -- I do not believe you can be good at people and computers at the same time. And it is much more profitable to be involved in people, sadly.

Could this also be one of the _biggest_ reasons why startups fail? Lack of people skills and focusing on technical problems for their own sake? Not to mention that lack of communication about the intent of the code seems to be the biggest time sink on a programmer's productivity.

Of course, on the flip side, you can always hire smarter and smarter developers than the previous guy, since you kind of need them to navigate the previous legacy code base. This seems to be the pattern in our industry.


> This is why the best programmers are in startups

Oh, come on. Show your work.

For that matter, define "best." [1]

[1] My definition includes people skills.


Doesn't a startup, founding one at least, require even more people skills than your average job?


> I do not believe you can be good at people and computers at the same time.

You can become adequate, and you should. Yes, the monkey-bots running million-year-old legacy code are a pain in the ass (it's annoying being one, too) but you can learn how to deal with them, and you must.

People skills matter more when you're in startups. The business side is all pitch, pitch, pitch, and from a disadvantaged position. If you're technical, you need the people skills to size up a business co-founder, or else you'll end up with a dud, and you need to be good enough with people that he trusts you and takes you seriously.



