How to Succeed as a Poor Programmer (psgraphics.blogspot.com)
184 points by Impossible 24 days ago | 183 comments



The "Avoid Learning Anything New" advice is insane. (Well, practically all of it is, but that one really stands out).

I think the exact opposite advice is far better: never assume the way you know how to do something is best, and always be on the lookout for what others are doing that might be better.

Here's the thing about learning: the more you learn, the easier learning the next thing becomes. You form links, insights on relationships, new concepts that apply to old things, etc. If this guy thinks learning is such a burden, it's probably because he refuses to learn anything in the first place.

If he thinks he's a poor programmer, it probably has little to do with innate ability and everything to do with the attitudes he gives as "advice" in this blog.


Let me quote a friend who ranted about this very subject about a week ago:

I am of extremely average intelligence. On the absolute top of the bell curve, looking down at the rest of you. I am extremely tired of the whole attitude that _anything is possible if you just put your mind to it_. I put my mind to it. That is how I got through CS in university. I worked at least twice as hard as the people that said _just put your mind to it_ and then cruised through even the hardest course without ever studying more than an hour a day and then doing an all-nighter to get an essay in.

Working hard as hell to understand what some low-effort-high-intelligence "just put your mind to it" brat understood as soon as it left the professor's mouth is hard. It just shows me again and again that I drew the short straw in the gene lottery. I have to work a lot harder to do the same things.

Being told to just try harder by someone whose idea of intellectual work is "wherever my curiosity takes me" is really friggin taxing on my self-esteem.

----------

Now, he was drunk and angry, but there is something to it. I think both he and I agree with you on a matter of principle, but there is more to it than that.


He might be right. But let’s not forget that experience plays a role. Whenever I think of creating crud apps now, I think “ah, easy”. I didn’t use to think that.

Some people start programming at a very young age; by the time they’re in college they have the ability to relate all kinds of concepts. An ability that I have as well now, which I obtained after my bachelor’s and used during my master’s.


Man, this kinda feels like me. First I thought "anything is possible, just put your mind to it", but after encountering a crazy good programmer at my job I think differently.

I have a bachelor's degree, I'm 36 years old, and I feel like I have reached my ceiling. I still think I'm making progress, but it is so slow compared to some really good programmers at my job.

It sometimes makes me feel insecure, and I think of quitting my job, because it feels like I will never be able to be as good as others. On the other hand, maybe I should just accept that this is it, it won't get any better.


I was mostly the follow-your-curiosity type, with a few areas where I worked hard. I noticed much later that the reason I impressed teachers and later coworkers is that I'd seen everything before.

CS is an extension of math, a hard science where you seek validation from reality instead of others. If you take the style of the mathematician, it all comes easy. CS is an extension of linguistics; there might be a style that works there too, but there are also the lost people who count snowflakes instead of noticing snow is all about the same.


He is right though – new tech nearly always disappears.

Example: Learning CoffeeScript was a total waste of time. Learning jQuery helped me for a few years, but now jQuery is basically useless to me.

Based on past experience, I strongly suspect the same will happen with React, Rust and a bunch of other new exciting tech. There are countless examples besides the ones I mentioned.

But on the other hand, the time I put into mastering SQL or Unix will probably continue benefiting me for the rest of my career. Even C will continue to benefit me, even though it'll only ever be a small part of my job.

So I would modify his rule: Avoid Learning Anything New – Learn Something Old


Learning CoffeeScript was only a waste if you never got paid to write / never built anything useful in CoffeeScript.

The failure there is learning something that is not useful to you right now, not "learning something new".

Are there opportunity costs associated with learning one thing over another and making the "wrong choice" in terms of which stuck around longer? Sure. But you can't predict the future. Focus less on what you learned that didn't get much mileage, and focus instead on what you can learn next. Learn new things and old things - just keep learning! In the end, there are very few things you can learn that are a total waste of time - the more things you learn, the more perspective you have, whether or not you are specifically using the exact thing you learned.


Exactly. Also, things keep getting reinvented, often with something beneficial added but usually also something worse than what we used to have. So the perspective of past knowledge can often help you understand some aspect of the new stuff, if you’re also correspondingly open-minded enough to learn the new thing on its own in advance.

And recent experience has shown me that you can both be working with the worst that legacy code and CGI have to throw at you and also have to learn pre-release ECMAScript and TypeScript, to use on the same project. Some skills transfer — I’ve never had to fully relearn how to loop through something in a new language once I managed to master both for loops and map/each functional approaches, for example.
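
A throwaway TypeScript sketch of what I mean (the data and the tax multiplier are invented):

    // Same transformation, two styles that carry over to most languages.
    const prices = [9.99, 24.5, 3.0];

    // Classic for loop: explicit index and accumulator.
    const withTaxLoop: number[] = [];
    for (let i = 0; i < prices.length; i++) {
      withTaxLoop.push(prices[i] * 1.2);
    }

    // map: declare the transformation, let the library iterate.
    const withTaxMap = prices.map(p => p * 1.2);

Once both shapes are second nature, a new language mostly just changes the keywords.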

Other things are maddeningly poorly developed — tools to help you understand and refactor code are still very much language-specific at the moment and it’s always a one-off to port out-of-fashion languages and syntax to newer ones. Let’s not start on how testing guidelines and TDD haven’t significantly changed since 2003 unless you maybe include SRE work and better “testing in production” techniques... Some knowledge seems evergreen... or stale.

But it’s amazing how when you do keep an open mind, and focus on how something really works at a lower level, it still pays off over time. I still haven’t deployed anything in production with Docker and K8S but that doesn’t mean it wasn’t worth learning, it helped clarify that there’s more to production reliability than simple deployment scripts or imperative commands vs declarative repos — simply knowing it helped me better understand related topics, including additional reasons why immutable, fully-reproducible build systems are a good thing, or how deployment can be seen as a complete system instead of simply “what version of the code is on the servers now?” It also provides a promise of a cloud-agnostic future, and an alternative look at why 12-factor was good, but incomplete.

And there, again, one might say, why did you bother learning 12-factor when most places let you store log files on local EC2 drives, for example? Well, because if you can get over the information overload, having multiple ways of doing things means you’re more likely to be flexible in your approach, or ask “why” when someone says “just do it this way”. It of course has to be balanced with healthy pragmatism, to both ignore the “best option” when it’s too much work, or to recognize that there’s always a chance to continue improving things later on, or as a team.

What’s key is trying to think strategically about your code rather than tactically. A tactician would say “learning nothing new outside of what I need at this moment gets the job done faster” or “this fixes the bug” while a strategist recognizes that there’s always a future opportunity cost, or asks, “but what caused the misunderstanding that allowed it to occur in the first place?”

Also, it’s inevitable that things will change, so you should always either keep up with your specialty or keep expanding your generalist skill sets. And even your knowledge isn’t constant — there’s always something to re-learn if you haven’t used a thing in a while or in some new way...


So yeah, React may go away in three years. Do you think you would have been better off during those three years continuing to use jQuery? Do you think the hours spent learning the new thing were more or less than the hours saved by the new thing having a better API?

(Now, of course, I may be wrong on the cost-benefit analysis here. I don't think I am, and the ubiquity of the new stuff implies that most of the industry agrees... but no matter the answer, that is the way to think about whether learning something is worth it.)


I have a different opinion. Anything learnt will improve your thinking and help you learn new things. A kid who goes on to become a truck driver might never directly use algebra on the job, but learning algebra has shaped their brain and will help in all areas of their life.


Absolutely agree with mastering SQL.

Out of all of the languages I have learned over the years, only one has stayed important in every job and stayed mostly static - SQL.

There are many languages and libraries, but at this moment in time, all of them have to talk to a database at some stage, and in order to do that all of them convert the request to SQL.


Yeah true. As a business analyst I have queried many databases, but SQL doesn't change (much), and the dialects share a common base, even for the cloud based ones. My bread and butter.


I prefer to learn tech on demand: given a job, find the right tools that exist at the time, and then use them even if you don’t already know them.

Besides, having problem context makes the learning much easier, vs just saying “I’m going to learn React today” in a void.


> given a job find the right tools that exist at the time

If you don't know said tools beforehand, how can you be sure they are right for the job? When building a new website, for example, should you reach for jQuery? Or is React right for the job? Or is Vue even more right? Or will it be Svelte? Or is vanilla javascript the tool that you really need? How would you know?


> Learning jQuery helped me for a few years, but now jQuery is basically useless to me.

'A few years' is a solid return on investment. Besides, the reason jQuery is now useless is that JavaScript now has those capabilities, so your skills were 'grandfathered' into ES6/ES7.

Learning an applicable skill is almost never useless.


Yeah, I'm not saying it wasn't useful at the time, just that it's not useful now. As opposed to learning SQL or shell scripting which will probably be useful my entire career.


I'm convinced of 'learn old > learn new' in general, but I think there are some interesting edges: the older something is, the larger the gap (maybe chasm) will be between 'basic competence' and 'venerable expert'. Also, the more likely all the common problems you'll face will have been posted on the internet, and that the quality of tutorials and explanations will be superlative.

While the payoff of learning old > new is typically much higher (search: Taleb Lindy effect), I think matching your learning to the 'human api' is more important. For me, learning is very emotional: when I feel a sense of curiosity and intrinsic drive to know, I'll follow my nose, spend my time where it takes me. I want to spend this kind of energy in a certain way.

When I find myself with an instrumental cause or external need to know, it's most likely going to be because it's something typical, something old, in which case learning about it and being useful with it will require less of my spirit and drive to crack.

New language/tech fanatics tend to pressure through both sources (a la "look at our ingenious design breaking paradigms" and "it's so good your boss will want you to work in it (well, soon)"). Often the former argument is stronger, so it appeals to your curiosity - in which case you're best off searching for the useful kernel, followed by a swift exit in order to preserve your sense of discernment. Should you return there, provoked by your general interest, that ought to be your indicator of importance. How hyped you felt that one afternoon you learned about it after reading HN comments is likely not.

On the other hand, the 'should learn' brigade will tend to target the latter source of pressure (employability & centrality). If you find yourself feeling resistant to this and force yourself into it anyway, you'll easily burn out and douse your curiosity for the day. I've arranged learning resources for languages I find fundamentally dull multiple times, and made only very shallow dives into them.

When choosing what to learn, your built-in heuristics will tend to serve you much better than either a long list of common 'shoulds' or a tangle of overhyped 'musts'. Let natural forces do their thing in shaping what things will be presented to you: avoid paying much attention to the loud people where marketing, shills and zealots tend to roam.


I made the switch to vanilla JS from jQuery a year or so ago and it's great to see how jQuery (and other libs) pushed vanilla in certain directions. But I'll always miss $('#element') vs getElementById('element') and similar shortcuts.
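
For anyone mid-switch, the flavor of it (a small sketch; the selectors are invented, and the jQuery global is assumed to be loaded):

    declare const $: (selector: string) => any; // jQuery, assumed present

    // Lookup by id
    const a = $('#element');                        // jQuery
    const b = document.getElementById('element');   // vanilla

    // Class toggling across a set of nodes
    $('.item').addClass('active');                  // jQuery
    document.querySelectorAll('.item')
      .forEach(n => n.classList.add('active'));     // vanilla

The vanilla forms are wordier, but they're built in and they're not going anywhere.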


I agree about 90%. Sometimes though, it pays to learn the weird or new stuff. Sometimes, you can make a lot of money knowing the next big thing, even if it goes away in a year. You have to be well positioned though, like being an hourly contractor.

I'm generally the guy who knows the boring shit. C, networking, 'Linux', hardware, RF... But I keep telling myself I'm going to jump on the next bandwagon just to see how the ride goes.


The concepts behind some things like React will last. I have seen several incarnations of GUI event frameworks and once you learn one, it is easier to learn others.

That said, some technologies are fads, and learning them just for the sake of learning doesn't have a big payoff. I learned TCL at one job and have never used it again. There will always be things you learn for software development that prove to be duds. Get used to learning, and be happy when something has long-term success. It is very hard to predict what will be a long-term success.


> but now jQuery is basically useless to me.

Probably not, because even if you don't use it today, the problems and solutions you learned while using it probably made you a better developer and you know what was good and not very good about it, so even if you are making something with React, you probably remember the "old jQuery days" and don't make the same mistakes.


I assumed unix would last, but lately there is the concept where “the cloud is your OS” and I hope that isn’t true. I like Unix. However, I’m wondering just how much of what I type in the future will revolve around cloud apis and tooling instead


> Avoid Learning Anything New – Learn Something Old

That is exactly how I understood it.


I disagree that learning any new tech is useless in general just because you found two cases where it was. Any old tech was new at some point. What matters is learning new ways of thinking (paradigms?).

Then you see that syntaxes and framework APIs change, but it’s not a big deal because you already know what you want to do, can look it up in the docs, and can pick up new languages in a few days.

For example learning high level concepts like filter-map-reduce with the appropriate data structures is relatively language agnostic, so I have the same reasoning between languages and all I see is syntax choices. It greatly improves learning speed and then I am free to choose the best tool for practical considerations for each project.
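
To make that concrete, a minimal TypeScript sketch (the Order type and the data are invented):

    // filter-map-reduce: the shape is the same in most languages,
    // only the syntax changes.
    type Order = { total: number; paid: boolean };

    const orders: Order[] = [
      { total: 40, paid: true },
      { total: 15, paid: false },
      { total: 25, paid: true },
    ];

    const paidRevenue = orders
      .filter(o => o.paid)              // keep the paid orders
      .map(o => o.total)                // project out the totals
      .reduce((sum, t) => sum + t, 0);  // fold down to one number
    // paidRevenue === 65

Swap in streams, LINQ, or iterators and the reasoning is identical.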


Learning CoffeeScript was a total waste of time.

In what way? I too jumped on the CoffeeScript train, but considered it on the whole time well spent. First of all, I wrote a couple of useful things with it, so there is that. But more importantly, having gotten used to the CS way of doing things made it much easier to jump to ES6, and many of the concepts from CS could easily be reused.


> Example: Learning CoffeeScript was a total waste of time.

Given how CS influenced the ES6 (and later) revisions of JavaScript, I find that hard to believe unless you stopped writing CS and went back to writing JS exactly as you did before.


Reminds me of a programmer I met about 7 years ago. He was in a team of 4, writing internal software tools for a company.

He said he was annoyed that he always had to manually join other people's code. They would send him the files they changed, and then he would search for what changed and add their changes to his own code.

I asked why they didn't use source version control like Subversion. He asked me what it was. He was so happy to hear that such tools exist, and was going to look into it further.

And I on my part, was perplexed that none of those 4 programmers knew about it.

If you don't regularly read shit to improve yourself, maybe it's time to find another job.


Or maybe the rest of the team knew but never bothered to make any change, because why?


Agree. Everyone is a poor programmer when they start. If they all thought like this, we would have zero good programmers.


You're going to look back on this comment in a decade or so and chuckle at the hubristic immaturity of your youth. With any luck, you'll accomplish half as much as this poor programmer.


I think there is a lot of disagreement about what 'learning' means and how long it takes. A while back I got curious about Elixir. So I downloaded it, skimmed the documentation, worked through a couple of tutorials and wrote a couple of trivial things. Took me half a day. I now feel I have a pretty good feel for what Elixir is and what its strong points are. Absolutely doesn't feel like a waste of time, even if I never write another line of Elixir in my life.


If you consider 'new' to mean it is less than three years old instead of meaning it is unfamiliar then the advice is not insane.


He's a C++ programmer, so I assume he can also code in C. So do I.

When you go so low-level, then everything else just translates to "overhead" and it's always a trade-off (for example exchanging performance for "fast coding" or "ease of use").

At that point, you know there is nothing better, if better is understood in terms of "performance" (I develop firmware), so there is little room for improvement. Maybe Rust? I don't know. It may just be another trade-off.

"Ease of use" and "coding speed" are things that should be interesting to our bosses in order to save a few bucks.

> others are doing that might be better

Of course, in my case I am always open to new techniques: I'd rather be looking at and understanding how John Carmack splits and scans BSP trees than thinking that X language will do it for me (at the cost of "insert trade-off here").

But no. Just as the author, I'm not going to learn node.js because "new".


> He's a C++ programmer, so I assume he can also code in C.

I would make no such assumption. Most C++ programmers I've known would have a hard time being productive in a language where they had no collections, templates, virtual functions, destructors, lambdas, exceptions, etc. Trading deep knowledge of how C++ manages memory for knowing how to do it yourself might be even more of a challenge. A lot of C++ programmers know C because they were around as the two evolved, but at this point there are many who entered the profession with C++ and would find C almost as confounding as COBOL.


In the article he says he learned C++ in 1995, so he probably knows C.


He says he learned C++ in 1990, and never says anything about C, so it's still likely that he only has a distant memory of something like C. I have a distant memory of something like Ada, but I might struggle for a while if asked to program in Ada professionally.


I bet 1990 C++ was for most people C with classes.


Pretty much, IIRC. I wasn't exposed to it until about a decade later myself, but my wife was. Most or possibly even all C++ compilers were still front ends for C compilers. Exceptions either didn't exist or weren't considered reliable, so her company rolled their own using setjmp and longjmp - which don't do stack unwinding so forget about relying on destructors and RAII. Similar for templates, so no STL. Does anyone even remember the NIH class library any more? It was very stripped down compared to the cancer it has become.

So yeah, somebody who knew C++ in 1990 was probably within spitting distance of knowing C at the time ... but now, they know it the same way I know Lisp or Prolog, since I used those long ago too. But if they've stuck with C++ ever since, do they effectively know C today? Could they be productive in it more quickly than they could in Rust or Go which they'd never even seen before? That's not clear at all.


In the mid 90's, I was very surprised to discover that a popular telnet program widely distributed in Linux distros was written in C++, with cruft like abstract base classes with pure virtual functions and whatnot.

(I was looking at the source because the stupid thing would disconnect whenever the PPP connection went down; it was over-reacting to EHOSTUNREACH errors on the socket, which are recoverable; I fixed that, at least for myself.)

Here it is: https://github.com/marado/netkit-telnet/tree/master/netkit-t...


I mean, I’m a C++ programmer too, but I learn other tech so I don’t end up myopic in how to best solve a problem. You shouldn’t learn node.js because it’s “new” (also: it’s definitely not new anymore), you should learn node because C++ is bad at most of the things node is good at (and vice versa). (And before you think I’m picking on systems programmers, I also have a dim view of web developers that refuse to learn any non-web tools.)


While not agreeing, as I'd rather advise people to focus on T-shaped knowledge and being polyglot developers instead of siloing themselves as an "X Developer", there is something that kind of relates to that advice.

Focus on platform languages and be a late adopter, it saves a lot of frustration advocating stuff that eventually never takes off, or happens to die and then one is stuck doing maintenance work.


I don't know. I don't think he's really making such an absolutist statement, these are heuristics, I think they're meant to be guardrails.

ALAN is probably a very good heuristic for precisely the type of person who reads HN. I would bet most of us err too far in the direction of learning new stuff, while letting the projects we've started stagnate.


At risk of appealing to authority, I think you should look into exactly who Peter Shirley is before sounding off.


On the one hand I think it makes sense to put off learning anything new for a few years unless you really think whoa this is going to have staying power.

On the other hand I'll repeat an anecdote I've put out here before - I had a friend I was in a JavaScript course with in 1998 and one day I went over to his house and he showed me the really beautiful table-based demo site he had made to entice employers and I showed him how CSS worked at which point he thundered "That's the problem with this business, there's always something new to learn!"


He is an authority in a field adjacent to software development, but not actually software development. He's a researcher who develops things that happen to be implemented in software (or hardware, given what nvidia does), and "avoid wasting time learning new programming languages" is probably excellent advice to someone in that position. Most HN readers are not.


I had never heard of him, so I took the advice of the previous poster and looked at Wikipedia [1]:

> After one semester at the University of Illinois at Urbana-Champaign, he transferred to Reed College in Portland, Oregon, where he received his BA in physics in 1985, and then received his PhD in computer science from the University of Illinois, Urbana-Champaign in 1991. He then joined the faculty at Indiana University as an assistant professor. From 1994 to 1996 he was a visiting professor at Cornell University. He then joined the faculty at the University of Utah, where he taught until 2008 when he joined NVIDIA as a research scientist.

I think he probably knows more than you give him credit for.

[1] https://en.wikipedia.org/wiki/Peter_Shirley


Yeah, I don't really care who he is. Nor do I think I should: his advice should either stand or not based on its own merits. There are plenty of successful people that give terrible advice.


It's more like "avoid learning novelties" than "avoid learning". Give technologies time to be proven, for their strengths and weaknesses to become known. Wrangling the cutting edge is its own set of skills.

It is phrased a bit ambiguously for shock value, I think.


Indeed, you always have to look for new ways to improve, but you don't have to change the way you work. Likely you are doing a lot of things well. I think better advice is: be curious to learn some way to improve, be conservative in applying it.


Yes "avoid learning anything new" is a bit weird.

There are some things that are relatively easy to learn and would make things so much easier on you. How, as a poor programmer, can you afford not to learn them?


Probably just poorly phrased.

Something like "there are more new langs than you have time. Choose wisely"


This is such a bizarre post. If you have the self-awareness to admit this about yourself, you are probably not a poor programmer.

Poor programmers do things like allowing an incoming request to spin up unlimited concurrent threads. Poor programmers erroneously throw exceptions on any operational deviation - even if it can be handled without error.
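
The usual fix for the first one is to bound the concurrency rather than forbid it. A sketch in TypeScript terms (the class, names, and the limit of 4 are all invented for illustration):

    // A tiny semaphore that caps in-flight async work.
    class Semaphore {
      private waiters: (() => void)[] = [];
      constructor(private slots: number) {}
      async acquire(): Promise<void> {
        if (this.slots > 0) { this.slots--; return; }
        await new Promise<void>(resolve => this.waiters.push(resolve));
      }
      release(): void {
        const next = this.waiters.shift();
        if (next) next();      // hand the slot straight to a waiter
        else this.slots++;
      }
    }

    const sem = new Semaphore(4); // at most 4 jobs in flight per process

    async function handle(job: () => Promise<void>) {
      await sem.acquire();
      try { await job(); } finally { sem.release(); }
    }

Threads, goroutines, promises - the principle is the same: put a number on it.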

Most importantly - poor programmers do not learn from their mistakes and are unable to see that they are poor programmers.


I know many a poor programmer. The consistent thread between them is that they don't appreciate the fact that they don't know everything. You may be very well versed in a tech stack, but there are undoubtedly architectural implementations or design choices or algorithms which you simply don't know or haven't considered. A good programmer recognizes the situations where they cannot come up with the best solution and makes the necessary safety changes so that when the best solution does reveal itself, it's easy to swap out the old solution.
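
In practice that usually means coding against a small seam you can swap behind. A TypeScript sketch (the Ranker interface and every name here is invented for illustration):

    // Callers depend on the interface, not on today's best guess.
    interface Ranker {
      rank(items: string[]): string[];
    }

    // The naive first cut...
    class AlphabeticalRanker implements Ranker {
      rank(items: string[]): string[] {
        return [...items].sort();
      }
    }

    // ...and a call site that won't need to change when a
    // smarter implementation shows up later.
    function render(r: Ranker, items: string[]): string {
      return r.rank(items).join(', ');
    }

    console.log(render(new AlphabeticalRanker(), ['pear', 'apple']));

When the best solution does reveal itself, it becomes a new implementation of the interface instead of a rewrite of every caller.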

Another trait is the belief that they either never need to touch their code after they write it, or worse, that only they will ever touch their code. It fires me up to see code that was written with the complete intention of never coming back to it again, despite the fact that it has to be maintained to keep the business running - reports have to use the data it generates, other applications have to use its services, because we can't afford to keep re-implementing everything. I also really hate seeing code that seems like the dev locked themselves up in their basement for a decade to write it. All sorts of custom switches, hardcoded configuration values, experimental tech embedded so deep the only graceful way to relieve that debt is to nuke it.


Why should being aware of how lacking you are as a programmer have much bearing on whether you're good or not?

When I started programming seriously I had a mentor who gave me a long list of my shortcomings. It's taken years of hard, dedicated work to rectify most of those problems and I still have a huge distance to cover to be as good as I want to be.

Being a good programmer is a skill set acquired with hard, concentrated effort over time; not just a good attitude with some self-awareness.


In general, people tend to be overconfident in their beliefs and too slow to incorporate new information. There is experimental evidence to back this up (pdf warning: http://www.researchgate.net/profile/Baruch_Fischhoff/publica...). I think this applies to engineers as well as anyone else.


It is somewhat semantics but there is a big distinction between being aware and unaware of your failings that has a real impact on your output.


OP said "probably not" not "certainly not". Most people do not have a mentor to tell them everything they are doing wrong so they are probably not aware.


> Why should being aware of how lacking you are as a programmer have much bearing on whether you're good or not?

Being aware of what you are lacking allows you to improve on those things, work around those issues, and accept when a colleague is actually good at stuff you are not good at.


> Poor programmers do things like allowing an incoming request to spin up unlimited concurrent threads. Poor programmers erroneously throw exceptions on any operational deviation - even if it can be handled without error.

All of these examples make sense in the context of your business, but in some environments these might be best practice ;)


There's different levels of concern for a public API and an internal interface. If you control all publishers and subscribers, then you don't need to worry quite as much about sanity checking inputs or making every possible workable input produce a valid result.


> but in some environments these might be best practice

What are those environments? Can you be more specific?


There are plenty of environments where deviating from spec (even when fixing an apparently trivial bug) would not be ok without assessing the situation for potential unintended behaviors.

High-risk systems where a software bug could result in loss of life - aviation, submarine tech, defense, medical etc. Edit: This is for throwing exceptions on undefined behavior.

For unlimited threads, I imagine scenarios exist, particularly when scalable computing is so popular: large high-traffic web stores such as Amazon/eBay, financial institutions, etc. Either way, the problem shouldn't be solved by an individual developer on a project making things up as they go - it's a problem that probably needs to be defined at a framework layer, and discussed within the team of developers and requirements definers.


I don’t know what they had in mind, but certainly Erlang encourages those sorts of approaches.


Awareness is one important part of self-improvement, but taking the initiative and doing something about it is another thing.


A poor programmer wouldn't be on HN and the point about never learning anything new reinforces that. Most of the article reads like satire.


That's a major assumption. Many get curious when a plumber or other blue-collar worker talks on an HN thread. This isn't a secret clubhouse, it's a news forum. Just... tech-focused.


Getting to that thread regardless of the subject assumes some level of intellectual curiosity (however low that may be). That seems to be the thing lacking among all the worst devs I've had the misfortune of working with.


It's a great, succinct post. Deeply uncool. A programmer should be modest about their skills, skeptical about new-new things, eschew bullshit, and terrified of dependencies. I buy the whole thing.

Moreover, I trust the advice of someone who rates themselves poorly more than someone proclaiming that they're a hotshot.


ALAN. Avoid Learning Anything New

Terrible advice and mindset. It's good to learn for pleasure or curiosity. This way, you can enjoy reading SICP, Effective Java, Code Complete... Or you can use a new system: Linux, Mac, Android, iOS. Doing it, you shape your thoughts and mind and learn new ways, practices, or patterns; you don't have to use them just for having learned them, but even so, they will be useful to you.


It's not terrible advice. You mentioned SICP; funny thing is, years ago when I put someone on to SICP, the next thing you know, in our production code we started getting these weird-ass recursive functions... Also had similar issues with Design Patterns: all kinds of over-engineered class structures started popping up. On the side of ALAN, I used to work with a guy who did a lot of machine vision research; he coded everything the way he knew how for years, and he was super productive doing it. I didn't really like the code. But he got stuff done. Having said all that, learning stuff is still good; practicing stuff on non-critical code is the next step.

Taking some lessons from BJJ (Brazilian Jiu-Jitsu): when you compete, you go in with your A game - the things you have practiced, the things you have made work over and over again, your high-percentage moves. Over time, you add things into your A game as you gain experience in making those techniques work. You may occasionally find yourself in an odd situation where some "new" technique is screaming to be used, and you might try it out. But usually, when you encounter a situation you don't really know, you start working at getting the problem to change into something you do know, then throw your A game at it.

I think programmers should understand what their A game is.


I would gather though that the best way to learn new things is to apply them and use them in the code. That's the only way to get seasoned and find out when and where to use these things (like design patterns, recursive functions, etc).

It sucks when this is happening to a production system, and then maybe it's a mentoring & leadership problem instead.


You lost me at "next thing you know in our production code we started getting these weird ass recursive functions...". I don't mean to be mean, but recursion is pretty fundamental to a lot of algorithms, and it sounds like the SICP coder was just writing stuff more advanced than you were comfortable with.


Anything that is more advanced than necessary is too advanced to be comfortable with.


Might also be an inappropriate application of recursion in a language where loops are more common. I've seen people who learn new stuff go overboard with clever code in production that is less readable for their peers.

Recursion shouldn't even be considered advanced though, just like classes and first class functions. I'm all for making code as boring as possible, but to me understanding at least these concepts is a requirement of entering the profession of software development.


Sometimes recursion is the right solution. For example, a backtracking solution is simpler with recursion than without it.
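
A stock example in TypeScript (subset-sum; the function and data are invented for illustration):

    // Backtracking reads naturally as: choose, recurse, un-choose.
    function subsetSum(nums: number[], target: number,
                       i = 0, picked: number[] = []): number[] | null {
      if (target === 0) return [...picked];            // found a solution
      if (i >= nums.length || target < 0) return null; // dead end
      picked.push(nums[i]);                            // choose nums[i]
      const withIt = subsetSum(nums, target - nums[i], i + 1, picked);
      if (withIt) return withIt;
      picked.pop();                                    // backtrack
      return subsetSum(nums, target, i + 1, picked);   // skip nums[i]
    }

    console.log(subsetSum([3, 9, 8, 4], 12)); // [3, 9]

Doing the same with explicit stacks and loops is possible, just noisier.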


I think you kind of missed what I was saying. I was the one recommending it, having already done a bunch of functional programming, and I thought SICP was a nice intro (at the time) to functional programming. But they didn't quite get it, and their code was a weird mishmash of OO and weird uses of recursion with badly thought-out (stateful) functions.


But doesn't putting somebody on SICP in the hopes of helping them learn functional programming (is that SICP's concern?) sound to you as... aspirational advice? The book is as famous as it is infamous.


The point is more that with anything you learn, you often want to try out the new toys. But those toys aren't your A game, and we haven't yet learned the lessons of applying them to achieve real results, so often that can skew our productivity. Whole new frameworks can leave us completely incapacitated when we hit the edges of what is commonly done, until we either dig through things ourselves or find someone to answer our questions.


If you read what he actually wrote, he obviously means "Avoid Learning Anything Newly Invented/Created" not "Avoid Learning Things That Are New To You". In that context, SICP, Java, and Linux hardly count as new...


That's a charitable interpretation and good advice, but they specifically say "Only learn something new when forced" and give two examples 25 years apart. It certainly sounds like they aren't learning technology as it matures but literally as they are forced.


I don't know why so many people here assume that learning something ages ago and sticking with it is mutually exclusive to learning anything tangential. Those two examples are programming languages and their accompanying ecosystems, not switching those for 30 odd years is not an indicator of having stopped learning anything.


"All new ideas are bad" seems equally bad as advice.


I don’t think it’s that “all new ideas are bad” but more that it’s not always efficient to be the guinea pig. I have a pharmacist friend who says he tries to avoid taking any medication the first five years it’s on the market. Not because he thinks it’s bad but because he’s seen enough new medication to know that it takes about that long to really get a good picture of the risks and interactions.


"Wait for technologies to mature" sounds like a great heuristic for people who need to build things that actually work.


And for people who have better things to do than relearn a new JS framework every 2 years.


I don't mean to be offensive, but usually those things take like a month to learn if you're bad at it. People talk about that grind like it's anything other than the most basic of gestures for tracking the rapidly moving target of browser technology.

If this is what we mean by, "being a bad programmer" I guess I get it now. A refusal to actually track the state of the industry, instead being told by employers what matters.


> A refusal to actually track the state of the industry, instead being told by employers what matters.

Yeeees... so instead of being "told" by your employers, you're "told" by the hype-train? How exactly is this better?

Being fed up of the constant churn in JS frameworks is an entirely valid position.

Sure, maybe it only takes a month to learn the latest framework. Maybe it only takes a couple of weeks. But maybe I'd rather spend those 2 weeks doing something else that I consider to be more valuable (it's called opportunity cost).

And what about all the gotchas and quirks that every framework has? The pathological performance edge-cases, and suchlike? The ones you only discover after weeks and months of in-depth use? I'll have to learn a whole new set of those.

And what about my "legacy" codebase that used the last framework du jour. Do I just ignore it? Do I convert it? Hmm. Wonder how long that will take, and what else I could be doing with that time.

Maybe you enjoy the churn: endlessly learning useless knowledge that will be of no value to you in a few short years because it's no longer trendy. Lucky you. For me it got boring, because I've got stuff to build.


> Yeeees... so instead of being "told" by your employers, you're "told" by the hype-train? How exactly is this better?

Because occasionally our peers are right? Do you really have so little respect for everyone in the industry around you that every new piece of tech that comes along, you just assume it is a mass of incompetence and marketing?

A diversity of perspectives, ideas and approaches is a fertile ground for personal growth. There are never any shortages of such hype trains.

And at least they're made by fellow software engineers. Not, you know, corporate hiring committees.

> And what about all the gotchas and quirks that every framework has? The pathological performance edge-cases, and suchlike? The ones you only discover after weeks and months of in-depth use? I'll have to learn a whole new set of those.

Getting domain specific, you'd be doing that anyways because of how rapidly browsers are growing and changing.

> And what about my "legacy" codebase that used the last framework du jour. Do I just ignore it? Do I convert it? Hmm. Wonder how long that will take, and what else I could be doing with that time.

> Maybe you enjoy the churn: endlessly learning useless knowledge that will be of no value to you in a few short years because it's no longer trendy. Lucky you.

Why is it that the value of software is defined by if it is trendy 5 years later? That's a conflation of concerns I can't follow.

> For me it got boring, because I've got stuff to build.

For me, squatting on one stagnant pile of never-really-that-good technology building the same boring things over and over again at the behest of others is equally boring.

You suggest all the frameworks are poorly designed hype, but then decide you want to take an arbitrary moment in time (when you showed up, that fated day) and freeze everything there.

That reasoning seems unconvincing.


If you're unnecessarily spending a month a year on platform churn, that's nearly 10% loss of productivity right there. Doesn't seem like a good choice to me, unless you're genuinely gaining something important.


This was about learning, not re-engineering. Why are you conflating the two?

Lest we forget, the opening post advised you never learn anything new.


You can still learn the latest technologies so that you know what's potentially coming down the pipeline, while using the best mature tools for your bread and butter day to day work.


It also sounds like an excellent recipe for being utterly beholden to every industry trend that gains critical mass, despite numerous examples of how sub-optimal it is.

It's a strategy for a big corporate hiring committee, not an individual worker.

Of course the actual successful software companies "build the next proven thing" so even for them, this "wait and see" strategy is clearly not effective.


I didn't get that from the article. It's more like: most of the new tech won't stick and if you aren't smart enough to discern what will stick and what not, then focus on learning tech that has already matured and been adopted. Remember, the advice is meant for programmers who aren't that good.


I think it's a tongue-in-cheek way of saying "don't chase the latest framework/paradigm/trend/fad unless you have a solid reason to do so".

There's always an exploration/exploitation tradeoff when choosing your tech stack, and in my opinion most programmers overinvest in exploration, given that the terrain is infinite and covered in roughly equal local optima.


It depends. Some things are visibly awkward, unpleasant to learn and use, and clearly short-lived. It only pays to learn and work on them if you are paid a lot of money. My pet example from the 1990s is Win16, which I skipped entirely for Win32. Many 4GL languages from the 90s stayed there too. Every second of time dedicated to Javascript frameworks before React/Vue was probably a waste of time. (OTOH learning the bare-metal JS paid off well.)


Ember is still alive and kicking ;-)

Also, having spent time on frameworks before React/Vue makes you really appreciate what React/Vue do for you.


Angular?


Angular is disputable, remember AngularJS? :)


While it might be a bad rule as stated, I took it as a tongue-in-cheek way of getting at a much better rule: Avoid Learning Everything New. Or maybe, for the sake of a better acronym, Stop Learning Ephemeral Dreck. It's certainly good to expose oneself to new ideas. Sometimes it's worthwhile to dive deeper into something different. The problem is that a lot of programmers make neophilia into a lifestyle, forever distracting themselves and never learning anything really well. Scratch the surface, get some quick wins at something that's still new to everybody, move on to the next thing. Avoid any domain or technology where true experts can show you up. That's just another way to succeed as a poor programmer. Good programmers have at least some breadth and some depth in a few areas.


6. Be aware that most coding advice is bad. Think about whether there is empirical evidence that a given piece of advice is true.


Indeed. Usually, coding advice is too emphatic or clickbaity, e.g. "The 10 languages you have to learn in 2020", "The most important JS framework", "The 10 books you must read".

But in reality, all advice depends: on what you are doing, what you know, what your experience is...


In other words, "Only I hold the real keys, everyone else is lying to you. Trust only me."

No, sorry. Invalidates the entire post, #3 notwithstanding. Such a cheap, low effort, lazy cop-out when writing about any topic.

What the hell was this guy thinking? I'm certain he's got better wisdom to share than this.


Personally I don't think that's necessarily what he meant, and I don't think it invalidates the entire post. I do wish he added more detail though. He doesn't come off as an asshole to me, like your comment seems to imply.

If he were to say that most coding advice was noisy... I mean, I get that feeling too. There's lots of different approaches to the same problem that work in different scenarios, different work environments, domains of expertise, constraints, etc. And yet lots of coding advice is similar to, "<always/never> do X". Unilaterally. Period. And you will end up receiving lots of conflicting advice.

It seems like the smarter thing to do is take everything with a grain of salt. You don't have to change everything you do as soon as you read a new blog post, it's just a different approach you could add to your toolbox, then use it someday if it makes sense.


He's not an asshole, but he gave in to the lack of any kind of rigor required to make a blog post. We're all guilty of it, but I hope we're all also willing to call each other on it.


I spend at least two hours a day on personal projects, for the pleasure and the challenge. They just have nothing to do with programming or software engineering, i.e. my job.


Yes, it's a terrible, second-stringer attitude. Curiosity and obsession are a couple of the key differentiators between a workaday jobber in it for the money (like many who flocked to tech in the dot-com times) and a badass. TBH, if someone's only in it for the money, they're wasting their life in the wrong field when they could be doing something else such that their morale and satisfaction would be greater.


Ahh yes, I'll be sure to let *consults notes* Distinguished Scientist at NVIDIA and adjunct professor at the University of Utah Peter Shirley know that he's a second-stringer, not nearly "badass" enough for Hacker News commentator anon91831837, and wasting his life in the wrong field.


It is not like distinguished scientist credentials necessarily correlate with exceptional code quality or software development practices. And it is not like NVIDIA is particularly known to be a place that excels at such.


It's actually sad, his credentials suggest he's capable of a much better post than this...


Either the post is intended sarcastically, or there's something to learn from it.


> Curiosity and obsession are a couple of the key differentiators between a workaday jobber in it for the money (like many who flocked to tech in the dot-com times) and a badass.

Unit tests and sensible comments are a couple of the key differentiators between a workaday jobber in it for the money and a badass. And now what? While I kind of agree that some people might be in the wrong job, that's true for every profession and the comparison seems lacking. If we're talking day job programming, a badass is neither a requirement, nor desirable in most environments honestly.


Two things jump out to me:

> Only learn something new when forced

I think there is a balance between always doing things in a new way and always doing things as you've done before. When engineers are pushed too hard on deadlines, some will avoid learning new things as a short-term approach for quick delivery. If you're in that environment, you aren't going to grow.

> Avoid linking to other software unless forced. It empirically rarely goes well.

Source? The rapid growth of npm, rubygems, and other ecosystems suggests otherwise.

I was hoping this would talk about how to support your co-workers (code review, culture, cohesion) or how to succeed at non-engineering tasks other 'good' programmers may overlook.


> Source? The rapid growth of npm, rubygems, and other ecosystems suggests otherwise.

Probably referring to the dependency hell that can occur. Things work great when you first choose your libraries, but then months or years later while maintaining code you find x feature is deprecated or lib y no longer plays nicely with lib z or lib a is dependent on lib b version 2 and lib c needs lib b to be version 3, etc.


Learning is pretty simple; the golden rule is "Don't learn new tech, learn how to solve new problems".

Learning how to use something like Vue when you already know how to use React (or vice versa) is stupid because they both solve the exact same problem in a reasonably similar way with a reasonably different API.

A better example might be something like Postgres and DynamoDB, since it can go either way. If your problem is 'I need a database for a CRUD app' then learning the second one is stupid because they both solve that problem just fine. But if your problem is 'I have a complex use case, my data is in a bad format for the one I'm using and I'm taking a huge hit in performance' then learning the other one is a reasonable choice and probably not a waste of time.

Basically whenever you take the time to learn something, make sure you're getting something out of it in terms of end results. It feels good to just learn more of the same tech, and if the API is different enough it'll feel like you're making progress, but you're probably not.


> Learning how to use something like Vue when you already know how to use React (or vice versa) is stupid because they both solve the exact same problem in a reasonably similar way with a reasonably different API.

Settling on Mithril as my front-end library was a long journey from framework to framework. If I had stopped at Vue or React, I'd be much worse off for it.

Really, if I had stopped at the first front-end library/framework I used in web dev, I'd still be using PHP. Sometimes you need to move on, and if later asked about your technical choices in a professional environment, you need to have a professional answer that comes from wide experience.


> Really, if I had stopped at the first front-end library/framework I used in web dev, I'd still be using PHP.

Well, I did specifically give an example of two techs that do similar things that might be worth learning if they're different enough that one solves a problem the other doesn't.

I don't know anything about Mithril, but if you're right that you'd be 'much worse off' then that would be a similar case, no?


One could argue that it's not "different enough" seeing as how it ostensibly functions the same as React from a bird's eye view. However it takes learning both to understand the strengths and weaknesses, an uneducated assessment wouldn't be enough. All I'm saying is that if you're curious, find out. Don't drag your company into it but learn in your free time. It doesn't actually take all that long to pick up a framework.


I think I interpret "Only learn something new when forced" as more like: you should prioritize what to learn in order to do a task. Learning new things just to learn new things is different from having a concrete project that demands you learn one new thing. Learning the newest system of the week isn't really going to help you as much as a tried and true system that is well maintained and proven to work.

The second part, about linking to other software, is valid. What if the thing you are linking to is not going to be maintained? What if the thing you are linking to has a security issue? Now you are probably going to have to either link to something else or replicate the functionality. Only linking to things that are essentially hard to replicate, instead of blindly linking to anything else, limits that risk.

In the above examples it is more about striking a balance than dealing in absolutes.


I used PHP where everyone would reinvent the wheel and JS where everyone would install hundreds of packages.

This gave me the feeling that the truth is somewhere in-between.

Maybe NPM packages are overall low quality and it would be okay to use more of them if they were better, but I'm now just installing a package when I don't have the time or skills to code it myself. This saved me from dependency hell, but it also helped me to move faster than doing it all on my own.


I feel like Rails is the good in-between. It contains loads of helpful functions and tools to get 90% of the stuff you need done, but it's all part of one package, so it's all tested together and comes from one trusted org rather than 1000 random JS devs.


> Only learn something new when forced

This could be devastating, even more so in a small/medium company. I know of successful companies (I mean companies with successful products) with a C/C++ stack they considered "good enough", so they didn't change anything: the C++ standard, architecture, structures. Often that line of conduct was supported by management that saw any change or improvement as a cost. The result is always the same: one day they wake up realizing that the "product" is a pile of crappy legacy code. I know some cases. One of those companies was bought by a bigger company that asked them to bring the stack up to modern standards, with disastrous results because the programmers weren't skilled enough to port the code base to modern standards/architectures. In another case, the owners sold the entire division to another company that was interested in the clients more than in the product, and after checking the status of the code base, the buyers hired a group of consultants who rewrote it all in Java, with results that you can imagine.


Just because something is old or uses an old framework or old standard doesn't make it crap. Crappy code is crap, but its age doesn't make it crap. Code doesn't rust or deteriorate by itself.

For example, if you write good code now using C++14, and it's good code now, it will continue to be good code 20 years later. There's no property that adds bugs or "crappiness" to the code as the years go past. The invention of a new "C++40" standard doesn't obsolete old code or turn it to "crap".


Age does make it "crap", but not in the way you think. I can speak about C++. Code written in an older standard, let's say pre-C++11, that is 20 or more years old becomes difficult to maintain, because the new generation of programmers is generally not interested in legacy code or in learning old C++ standards, and skilled programmers don't want to be relegated to being eternal maintainers of old projects. For this reason projects die by asphyxiation. We have good code bases in COBOL, OK, but the point is: who cares? "Who" is what matters when you need new programmers. Besides, considerations of the language apart, the architecture is important: we live in a world of microservices, and old code not updated to modern paradigms can become "crap" even if it's the most elegant code.


The best thing about seeing the codebase for a long lived product is seeing the layers of when something was written by the types of patterns found in the code.


This is probably really bad but I totally empathize (and agree?)

KISS => everyone should do this

YAGNI => 95% of the shit I add is ignored (and I've got fairly objective proof that I can develop good projects)

ALAN => Master SQL, one scripting language, and how to use Stack Overflow. You'll be the most useful dev on the team.

I agree with most of the rest.

Bonus point: never be afraid to tell a business person that what they want is an exceptionally bad idea. It usually is.


I don't think most of this applies only to "poor programmers".

The advice about KISS is something many "good" programmers would benefit from following. Likewise, I've seen good programmers recommend using arrays as your first attempt at a data structure as well (e.g. Jonathan Blow, https://www.youtube.com/watch?v=JjDsP5n2kSM )


I’ve been in a rut lately. Missing obvious things, shipping less than my best code. It’s come to my attention that I’m not as good at programming as I am at crafting database queries and tuning them but that’s such a small niche given how easy it is to pick up SQL that I’m having an existential crisis.

So I’m working on getting better and getting more confident.


You could make a career out of just SQL. I know my company could use a decent SQL programmer. We’ve got hundreds and hundreds of procs written by SQL amateurs


Correct. Tuning SQL queries can deliver performance improvements of several thousand percent, in exchange for just a few hours' work. EXPLAIN is your friend.
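
If you've never poked at it from application code, here's a hedged sketch in TypeScript using node-postgres (the table, column, and connection string are invented; EXPLAIN itself is standard Postgres):

    import { Client } from 'pg';

    // Print the plan Postgres chose (ANALYZE actually runs the query).
    async function explainQuery(): Promise<void> {
      const client = new Client({ connectionString: process.env.DATABASE_URL });
      await client.connect();
      const res = await client.query(
        'EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42'
      );
      for (const row of res.rows) {
        console.log(row['QUERY PLAN']); // one line of the plan per row
      }
      await client.end();
    }

    explainQuery().catch(console.error);

A sequential scan over a big table in that output is usually the few-thousand-percent improvement hiding in plain sight: add the index, then re-check the plan.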


That’d be a lot of fun. I find so much joy in that actually. Figuring out what variation of a query satisfies the query planner to create the most performant plans is fun!


>I’m having an existential crisis

Don't. There are so many poorly performing queries and databases in the world that you could make a very tidy income just specializing in sorting that out. Even if you knew nothing else that skill will keep you employable for decades to come.


Or more to the point, don't define yourself by your work.


It's easy to pick up anything. It's difficult to master most. If you're good at SQL, why not transition to be a full-time DBA? In the right companies, good DBAs are highly valued.

I suspect there is quite a bit of business in helping companies that want to transition from Oracle or MongoDB to PostgreSQL.


I was a SQL Server DBA for a while and found the administration side of things boring. It was the crafting of queries and the designing of schemas that was the most fun.

If the administration part is a necessary tax to be paid, then so be it, as long as I get to do the other part.


> So I’m working on getting better and getting more confident.

FWIW, I'd say the most important characteristic of terrible programmers is their supreme unearned confidence!

Which is to say: Focus on getting better. The confidence is secondary.


The confidence is purely for sanity reasons. There’s nothing more demoralizing than agreeing to do a seemingly simple task and then failing. So I’m getting better, so that I can have confidence in myself.


I'm a quite senior engineer, and I fail routinely at tasks.

My secret trick is simply to ask people for help when I'm stuck.


I will say, the support my thread has received has been super nice and helpful.


I totally agree with this stuff.

I was a professional programmer (now retired), and not a very good one. I'm familiar with Dunning-Kruger; I've worked with good programmers and bad ones, and I can tell the difference. Very good programmers are few and far between.

I noticed that most of my colleagues were keen to learn new shit, like new JS libraries, new languages, new source-code management systems and so on. I think I lost interest in newness (for its own sake) about 15 years ago; I got turned over one time too many by a vendor that decided to withdraw support for a programming language that I had committed myself to.

I recommend retiring from programming. You don't have to keep up with the young whippersnappers any more, you can carry on coding in bash, you don't have to use git, Docker, or weird NoSQL systems. I realise that some of this new-fangled stuff is better than FORTRAN or VB6 or whatever; but learning a new programming system every 6 months is a total waste of time and effort. Get to be good with a few useful tools, then concentrate on people skills.

Or give it up completely, and learn cooking, or drumming, or interior-decorating.

I think there may be some rationale behind ageism in software development. For the first 25 years or so I got better at it, but I think after I turned 50 I started getting worse. Or at least, I got better slower. It took me longer to learn new tricks.

But I really think that some of those new tricks were not worth learning - for example, you can stuff Node and that ridiculous dependency system where the sun don't shine. JS is a very clever language; but cleverness isn't always best.


> you can stuff Node and that ridiculous dependency system where the sun don't shine.

You're really taking embedded computing to the next level


If you ALAN, you can't KISS. You won't know what the simplest thing is. You'll end up with miles of nested "ifs" instead of 4 lines of recursion, and a gazillion "else ifs" instead of a case statement, etc.
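
To illustrate the recursion point with a sketch of my own (not the parent's code): the depth of a binary tree in a few lines of recursion, versus the hand-rolled explicit-stack walk where the nested "ifs" tend to pile up.

    // Depth of a binary tree: four lines of recursion.
    struct Node {
        Node* left = nullptr;
        Node* right = nullptr;
    };

    int depth(const Node* n) {
        if (n == nullptr) return 0;
        int l = depth(n->left);
        int r = depth(n->right);
        return 1 + (l > r ? l : r);
    }
    // The equivalent hand-rolled explicit-stack version of the same
    // walk is where the miles of nested "ifs" usually come from.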

He's got one thing right. He's a poor programmer. "Success" must be mighty loosely defined here.


I was all curious to read about monetarily poor programmers. Instead, this. I agree with part of it. The goddamned arrays really trigger my Refactoring Legacy Code PTSD though. Any time you make an array, you need to make sure that damn thing is big enough. Do you even know in advance how big that bastard needs to be? You could make it just plenty big, I suppose, like an asshole. Just allocate 10,000 4-byte blocks like that shit just grows on trees. Then change it to 15,000 the first time someone has 10,001 things and crashes your shitty shit. Also any time you look at your shit you're gonna need a goddamned shitty index. A whole 'nuther variable! Get some generics ya dufus!
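
To be fair to the generics point: in modern C++, a growable std::vector makes the sizing problem disappear entirely. A minimal sketch of my own (assuming C++11 or later):

    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> items;            // no size guessed up front
        for (int i = 0; i < 10001; ++i)    // item 10,001 just works:
            items.push_back(i);            // the vector grows itself
        std::printf("count: %zu\n", items.size());

        long long sum = 0;
        for (int v : items) sum += v;      // range-for: no index variable
        std::printf("sum: %lld\n", sum);
    }

No 10,000-element guess, no crash at 10,001, and no separate index variable just to look at your own data.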


Former monetarily poor[0] programmer here. AMA

[0] $7k per annum take-home-pay.


> If you are bad at programming, you are still programming, something that very few people can do.

That's heartening. I bet a very bad doctor or lawyer can still do some good somewhere too, as long as they don't convince themselves they are better than they are.


I think the best way to make it as a 'poor programmer' is to be a domain-specific 'poor programmer'. I am bad at nearly all the programming in all my projects, but I'm probably just as efficient at getting jobs done as a 'good programmer', because the feedback loop is fast. I can hack away until something works, and then sometimes polish it off once it functions. I suggest working with really popular tools, where there is usually a solution written up online, easily found through Google. Such are the times.


Please don’t store everything in arrays. If your language supports tuples, use them. If it doesn’t, at least use a struct if possible. Basically: constrain the possible input and output space of your functions to the smallest set of possible values. That way, you reduce the room for errors significantly. With arrays everywhere, many errors might even go unnoticed until an edge case occurs.
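
A minimal C++ sketch of this point (the Rgb type and blend function are my own invented examples): a struct constrains the value space where a raw array invites silent misuse.

    // Loose version: which index is red? Is color[3] a typo or a bug?
    //   float color[3];   // nothing stops an out-of-range index
    //                     // or silently swapped channels

    // Constrained version: named fields; the type documents intent.
    struct Rgb {
        float r, g, b;    // each channel expected in [0, 1]
    };

    Rgb blend(const Rgb& a, const Rgb& b) {
        // No way to transpose channels by fenceposting an index.
        return Rgb{(a.r + b.r) / 2, (a.g + b.g) / 2, (a.b + b.b) / 2};
    }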


A very interesting post, since it poses one very conservative, tried and true approach to programming productively. I know a lot of programmers tend to arrive at those very same conclusions in their career. That said, I think the article and most of the commenters here don't necessarily conflict with each other; every piece of advice can apply or not, depending on the circumstances.

On a larger scale, it's remarkable that a lot of the general programming advice given today is more of a heuristic/guideline than a universal law, yet many people still assert that their advice is The One Right Way To Do Things. It's important to understand that the vast majority of this advice comes from real examples of what worked and what didn't. So, perhaps the best thing one can do as a programmer, in any domain, with any technology, whether you're a great programmer or a poor one, is to listen to it all with zen and a grain of salt.


I dislike this mindset too. For any other profession it would be crazy, but for programming it's fine? What happened to apprenticeship and mastering the craft? Why is it fine to be considered a fool if you are not an expert programmer from the start? Should a blacksmith never make a sword because, at the beginning, they can only make nails?


This guy is poor because he is lazy as hell. The most important thing about programming is constantly learning new ideas, tools, and paradigms, solving new problems, improving as a person, and learning as much as possible. If you don't do these things, and don't like to think or solve problems, you will indeed stay poor.


Kids, trust your elders on this one, the author is correct.


With regards to "Avoid Learning Anything New" I would be very surprised if Pete Shirley follows that advice when it comes to things actually relevant to his job. People who avoid learning anything new don't get to be senior researchers at Nvidia or have the following publication list: https://scholar.google.com/citations?hl=en&user=nHx9IgYAAAAJ...

I guess what he's trying to say is Avoid Learning Anything New if it's ancillary to your job, and instead focus on your core competences.


> Finally, let the computer do the work; Dave Kirk talks about the elegance of brute force (I don't know if it original with him). This is a cousin of KISS. Hardware is fast. Software is hard.

Yeah, I don’t know. The cloud bill is going to be expensive.


It is also terrible advice with regard to energy consumption.


One problem I have with 1. and 2. is that they tend to escalate.

Example: "We're doing a combo-box component, but to keep it simple let's not support multiselect."

Retrofitting such a feature when You Eventually Do Need It(YEDNI?) is neither pleasant nor simple.

My take is that one's skill level is not the most relevant thing - we have code reviews to deal with exactly this problem.

What ultimately matters is whether you're adding or subtracting value.

I've worked with people who were aggressively incompetent. As in: they had bad ideas and were insistent on implementing them, even going as far as bypassing the regular review cycle.


ALAN (avoid learning anything new) is bad advice if taken literally.

Learning fundamental knowledge such as programming paradigms, data structures, and algorithms is something that will likely not become obsolete on a year-by-year basis. You should totally learn this if you have the time.

Now, memorizing every API in a framework that is likely to change in 6 months is probably not going to be very useful in 5 years (but it can be beneficial for achieving your short-term goals and moving your career forward).

It's not about "avoid learning anything new", it's about being tactical about what to learn.


std::vector everything!! TBH I feel anything beyond a simple array/vector/stack/BST/queue is pretty advanced and should only be touched by advanced programmers...


"Programming is rather thankless. You see your works become replaced by superior ones in a year. Unable to run at all in a few more." ~ why the lucky stiff


This is essentially making the case for a specialist; it's just worded poorly. There's a case to be made for such folks, and there's a case to be made for generalists... and if you're super smart with tons of memory, you can be a T-shaped person. It doesn't matter which path you choose; you can thrive in any of them.


I wish he had just used the word 'bad' instead of 'poor'. I thought this was about cash-strapped programmers.


ALAN might be good advice, but software is such a miserable place unless you can learn and apply new stuff. Boredom would get the better of me if I did things the same way every time.

UPD: just realized who the author is. He _does_ learn a lot of new things, just not in software development, because it's not the focus of his career.


Rather than Avoid Learning Anything New, I'd say avoid implementing anything new. Options always come in handy, you don't need to implement every new tech you learn about, but knowing it exists can make the difference between an impossible feature and a possible one


There is no formula for succeeding at anything. The more people who know something, the less it is valued. It is just random; accept it. Success here is just a tech flavor of random. You can always be a salesman, but you can't be both a good salesman and a good scientist.


I wonder if the author has considered the idea that it's not a lack of effort or some inherent cognitive gap that keeps them from being "good" programmers, but perhaps these beliefs instead?


I think it’s a matter of time. Peter Shirley decided to specialize in computer graphics, leaving the programming to the people who implement his ideas in libraries, and to the production developers who use those libraries.


Interesting, but does that change anything though?

I had a famous cryptographer professor. He refused to learn anything past Pascal. He freely admitted he was no longer a programmer and that he shouldn't be doing it.

That seems like a different message from the messages presented here.


I didn’t realize this was Pete Shirley’s blog. Isn’t he still a professor?


Most of this seems like good advice for all programmers.

Except for the ALAN thing. I think that should be modified with a few qualifications: don't learn anything you will use fewer than 10 times, or something like that.


If "Be aware that most coding advice is bad." is true, then how confident should I be that this "How to Succeed as a Poor Programmer" coding advice is good? ;)


Yes.


>>I discuss how to be _an asset to an organization_ when you are not a good programmer.

Sounds like it's advice on how to sandbox yourself in the interest of the org.


I assumed it was satire


Diligent practice?


Brutally true :)


I love this. This is a good programmer. Sure to piss off a lot of people on here who think that, because they have big salaries, they are good at something.


> 4. Make arrays your goto data structure.

The others are arguable. But, in 2019, this is flat out bad advice.

Your "go to" data structures should be hash tables about 70% of the time and vectors about 30% of the time. In 2019, memory and CPU are so stupidly abundant that the abstraction costs nothing in 99.9% of all cases. The programmer gain for not having allocation, dereference, fencepost, and invalidation errors is enormous.

But, then, this is hardly surprising advice from someone who only learned about a "scripting language" in 2015. The rest of us realized that those silly "scripting languages" were better than C++ for 90+% of our problems way back in 1995.

And anyone who has used a "scripting language" realizes extremely quickly just how stupidly useful hash tables are.


> Your "go to" data structures should be hash tables about 70% of the time and vectors about 30% of the time.

I literally just wrote a comment elsewhere about how hash tables are obscenely overused and cause measurable performance degradation in many situations.


That may be.

However, the number of times I've seen people hit a bug because they fenceposted or flat-out overflowed a fixed-size array VASTLY outnumbers the times I've seen people have to redo their underlying data structure because it just wasn't fast enough.


We're just in different fields. The OP of the post was a graphics engineer, as am I. We drink to celebrate when we shave off a millisecond haha.


I've shaved 10 microseconds in large refactors and been told that's great ship it. Other fields might say that wasn't nearly enough to justify the code churn (performance vs. "programmer productivity" argument). A millisecond is a lifetime and can mean the difference between shipping and not shipping in some products.


Odd that we feel differently about hash tables then.


I didn't mention how I feel about hash tables (I'm not OP), but I think we feel the same? I definitely agree that arrays (potentially vector-style growable arrays) should be a programmer's go-to data structure, both for simplicity and performance. For my use cases I'd probably swap the array and hash table percentages with OP (70% array and 30% hash), but really it's probably more like 90/10. Computing hash functions and resolving collisions can be very fast with the right hash table implementation, but it's absolutely not free. There is a reason why Lua tables can offer array-like access and memory usage, even though from a language standpoint a Lua table looks like a hash table with some fancy features.
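
A tiny sketch of the "not free" point (my own, with hypothetical function names): the vector lookup is effectively a pointer add, while the map lookup has to hash the key, find the bucket, and compare strings.

    #include <cstddef>
    #include <string>
    #include <unordered_map>
    #include <vector>

    // Vector lookup: effectively base + i * sizeof(int).
    int by_index(const std::vector<int>& v, std::size_t i) {
        return v[i];
    }

    // Hash lookup: hash the key, walk the bucket, compare strings.
    int by_key(const std::unordered_map<std::string, int>& m,
               const std::string& k) {
        return m.at(k);
    }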


Ah didn't realize you weren't OP. I think any programmer that operates in the realms of microseconds probably would understand the overhead of both the hash function overhead as well as the coherency issue. I'm not a lua programmer but I remember studying the LuaJIT source pretty extensively.


> Your "go to" data structures should be hash tables about 70% of the time and vectors about 30% of the time.

Depends on your program. In my world, it's vectors about 99% of the time when it's not a fixed-size array.



