Becoming a Better Software Developer: A Handbook on Personal Performance (7pace.com)
360 points by encorekt on Sept 20, 2018 | 73 comments



Man, I thought this was going to be an article on better code reviews and how to get better feedback from people you work with regarding your performance.

Instead the article is some strange "self-help"/"hustle-hard" blogspam, quoting Malcolm Gladwell, shading in a picture of the normal distribution, and pretending it has arrived at some brilliant insight.


I'm all for having THAT discussion. Things that come to mind -

- treating PRs as communication and ensuring the reviewer has the information they need to check what you coded.

- Building a culture where discussion around alternate ways of doing things ('this doesn't block merge, but...') is accepted and expected (without becoming hostile/nit-picky or devolving into bikeshedding -- if it's not blocking, the submitter can always just merge)

- Having brown bags to talk about technology in a freer, non-deadline-constrained setting (sparks ideas, gets people building tech communication skills)

Really interested to hear what thoughts others have on this.


> treating PRs as communication

I kind of wish that there was a type of thing like a PR, but marked in a way where you couldn't actually merge it. It'd still get built by CI, if such a thing was configured; but the point of the submission of such a "proposal with prototype" object would be to discuss whether the design represented by the prototype implementation is a design worth going with.

The actions applicable to such objects would be "accept and close" or "reject and close." You'd be able to have several of these objects that live under a given issue (i.e. several potential solution-designs to the same problem), and—as long as the issue has at least one proposal-with-prototype object under it—the issue would be in a "pending" state until one such proposal object was Approved, and Approving one proposal object would Reject the others.

The point of this would be to replicate the thing that people go through with design discussions on mailing lists when they send in code samples to explain their designs—but with those code samples being working, buildable code, such that the properties of the design proposal can be tested against the current implementation and against any alternative proposals.

You could call these objects "RFCs" :)


My previous team had a convention of putting a prefix "[DO NOT MERGE]" on the subject line of RFC-style PRs. Worked well enough, though it would have been nice to have the tool enforce that also (even a checkbox like "prohibit merging" would have worked).
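
If the tooling won't enforce it natively, a CI check can approximate it. Here's a minimal sketch in Python, assuming a GitHub Actions-style setup where GITHUB_EVENT_PATH points at the pull_request event payload; the tag list is just whatever convention the team uses:

    # Hypothetical check: fail the build when the PR title marks it as
    # not mergeable. GITHUB_EVENT_PATH is the JSON event payload that
    # GitHub Actions provides to each job.
    import json, os, sys

    with open(os.environ["GITHUB_EVENT_PATH"]) as f:
        event = json.load(f)

    title = event.get("pull_request", {}).get("title", "")
    if any(tag in title.upper() for tag in ("[DO NOT MERGE]", "[WIP]")):
        sys.exit("PR is marked as not mergeable; failing this check.")

Marking that check as required in branch protection would then keep the merge button disabled until the prefix is removed.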


GitLab actually enforces this for PRs with [WIP] in the title.


There's a browser extension that does something similar for GitHub too. At least for Chrome, anyway; I imagine similar ones exist for other browsers.


Thank you for that tidbit of knowledge. Learned something tonight.


Absolutely, this. We did the same thing -- [DONT MERGE] or [WIP] in the title.

That way you could get feedback from other devs, and pushing commits to your PR would trigger CI runs so you could be made aware of any build failures you were causing. (The builds took hours, sometimes, so running the full test suite locally was impractical)

We used GitHub, but I'm sure it would work in a lot of workflows.


This is pretty conducive to a Gerrit (https://www.gerritcodereview.com) workflow. If unfamiliar, Gerrit essentially treats each commit as its own PR, or "change" in Gerrit terms. It uses git's refspec as a staging area, so each commit basically becomes a mini branch. Changing a "change" then requires amending the commit and pushing a new "patch set".

So what my team does when we want to prototype something is create a change and then push separate patch sets for different solutions (there is a nice diff tool to view differences between patch sets). The CI runs for each patch set; we then discuss the merits of each approach, "abandon" (reject) the change, and use it for reference going forward.


We sometimes have this use case at work where we want to discuss a potential idea and get feedback without merging anything. What we do is open a PR and decline it directly (I'm using Bitbucket lingo). This way it can't be merged, and the discussion moves away from the particulars of the diff and focuses more on the idea and the architecture of the change.


I was just about to link to something like this[0] before I read your last sentence.

[0]: https://github.com/rust-lang/rfcs/pull/911


Hmm. I suppose your CI and code review tools allow code on branches different than master?

So you can directly use them to show a what-if PR. I did; it worked, in that it sparked a discussion and led us to design things in a better way.


Yeah, I was picturing GitLab's CI/CD workflow here, where the RFC objects would get Review Apps built from them.

My objection to using PRs for this is that people think the point of a PR is to merge the code, and people tend to nitpick the code during code-review with the goal of making it clean enough to merge.

The idea of a separate RFC object is that, unlike a PR, you literally cannot merge an RFC, so there's no temptation to nitpick, or really to talk about anything other than the design. It much more closely mimics the social mores of a mailing-list thread discussing a code snippet.

Also, being able to explicitly track Approved and Rejected RFCs on a system level would be nice, to know what discussion needs to be referenced when doing the final implementation. If you just used "what-if" PRs, both the chosen and not-chosen designs' PRs would just end up in the Closed state, and would show up equally in search. Ideally, Rejected RFCs would be filtered out of search by default.


Interesting idea. We already sort of have a special-case MR (WIP) where merge is disabled until the work-in-progress status is removed. It could be interesting to have an RFC equivalent where no merge is clearly ever intended, but you still get the conversation flow, review app, and so on.


Definitely might be useful for teams/projects organized enough to have an RFC process.


    Building a culture where discussion around alternate 
    ways of doing things ('this doesn't block merge, but...') 
    is accepted and expected (without becoming hostile/nit-picky 
    or devolving into bikeshedding -- if it's not blocking, 
    the submitter can always just merge)
Yeah, this is IMO an important thing you want to do in code reviews. Specifically, when it's part of an ongoing collaboration and the feedback can be put to use in subsequent reviews.

We had soooo much blocking nitpicking and bikeshedding at my last job.

The "code artistes" among us would block PRs for these sorts of debatable style issues and other nonessential issues that weren't even remotely blockers IMO. Those discussions were a real sap on productivity and team cohesion. And management was unwilling to give any direction.

(It was a particularly big problem on our team because our test suite was a real pig, and moving code through the build/test servers and out to production could take hours sometimes -- so highly debatable nitpicks could result in literal days of lost time)

When I wanted to give that sort of non-blocking constructive feedback, I always simply did what you mention: I left the feedback, discussed things with the submitter, and approved the PR. Not rocket science. Although, apparently, it was beyond some of our devs' comprehension.


One PR does not a code base ruin. Never block a PR unless it's a customer-facing issue. If you're getting into a "thin edge of the wedge" situation, or you have bad actors on your team, address that issue separately. If you need the leverage of blocking a PR to force the conversation, then you have already lost all hope in that team. Find another place to work (for both groups' peace of mind).

Like you said, some people can't comprehend that. However, it's exactly the same "bad actor" situation. If people are executing a denial of service attack on your process in order to get their own way, then you need to address that situation -- outside the context of the PR. If you can't solve the problem, then it's probably time to consider voting with your feet. Working with bullies is never going to be fun.


I've been wondering if it would be useful to have tools that help build better narratives for proposed changes. Which files should be viewed in which order? How do the changes tie together?

It's not quite literate programming: the commentary would be interwoven with the code, but not permanently attached to it.

Existing comment systems handle the interweaving of change and commentary all right by interleaving comments into the diff, but they still don't allow you to choose which files or diff chunks are displayed in which order.


Thanks for sharing this idea! We have a similar issue regarding this [1], so please feel free to engage in the discussion there and bring some attention to it. We'd love to hear more from you on this matter!

[1] - https://gitlab.com/gitlab-org/gitlab-ce/issues/18037


I'd never thought of that. That's brilliant. I agree: it'd be nice to somehow control the order in which the diffs were presented to the reviewers!

This would IMO be a killer feature for GitHub or GitLab to implement.

Typically, I explain complex PRs "manually" in the PR itself or by screen sharing them with the other devs, but this is not always very efficient.


I want pre-PRs. Call it email lists, or waterfall planning, or RFCs, but I am fed up with starting on tickets/work and finding out that there are 5 different POVs.

I think, in short, every business would win a lot from having a PEP-like process where business owners and coders discuss what is wanted.

So while I am wishing

It shall be illegal, punishable by a week in the stocks to

- ask for an estimate verbally for any job that has not had at least 200 words describing the requirements and been responded to with interrogating questions and explanations

- to work on any project that does not have a 2-page summary and has not been broken down into a minimum of 20 separate 100-word requirements

- and a pony


Actually, I was sceptical to begin with, and I agree that it is not an in-depth article, but I think the general principles outlined are good ones for someone to follow.


Yup, there's a fair bit of stuff that's intuitive to a seasoned engineer, but it's great to have specific terms to talk about and describe it with. The "diffuse" vs "focused" thinking was a great example that I'm definitely going to use elsewhere.


I first heard about it in the excellent Learning How to Learn online course: https://www.coursera.org/learn/learning-how-to-learn


Software is not sports. Stop comparing software to sports. Stop comparing developers to athletes. Stop making analogies to running. Just stop.

In fact, please stop doing that for all serious vocations, life is not a game and that's all sports are: a game. That's why everything in sports is super simple, straightforward, and measurable [except when it isn't, but don't mind me].

Sports are not really a good analogy for most things, because sports are entirely relative and have no fundamental value. No soccer player has any inherent value except for being better than a large amount of other soccer players. So it makes sense that soccer player skill follows a binomial distribution since it's basically that by definition.

This is not the case for developers. Developers do not need to be relatively best to produce a lot of value, and mostly produce value by the nature of, well, producing it. If you create a useful website, you created a useful website, it really doesn't matter if you are the best website creator on the planet. The value of that website has no relationship to your abilities relative to others, and has much more of a relationship to things like whether it's useful for customers.

And since there are a range of domains, you actually end up with balderdash if you measure developers against each other, since some are good at debugging, some at C, some at websites, some at arguing with product owners, and some at writing AIs in Python. There's really no point or benefit in trying to homogenize them all into a group and rank them.

What we need is not "great" developers, what we need is higher standards for all developers, so that across all these varied domains we have lots of people who respect quality.

The "fundamental gap" between developers is likely not there and I think this thinking is actively harmful. All we need is to encourage more developers to care, which is already a hard enough problem without getting into the useless weeds of being the developer equivalent of Muhammad Ali, which seems like a great way to discourage literally 99% of developers because that is how sports work in practice and most sane people avoid sports [as a job] for that reason.

Rant over.


Yup. I think there are actually some good ideas in here, but they're surrounded by ridiculous conclusions. For instance, I think deliberate practice is a good thing; you can quibble over the definition of that, but fundamentally it's true that people often plateau in their skills because they get comfortable and lack a growth mindset.

That's great, but then it says that in order to improve you need to declare a metric so you have something to measure, and then work to improve that. I don't even know where to begin with this; Goodhart's Law is the obvious response. But even more fundamentally than that, the thing that makes programming hard is entirely the qualitative aspects. Every single choice you make in software is a tradeoff with subtle implications; good software engineering is about understanding these tradeoffs in a given context over a period of time. There is absolutely no way you are going to reduce this to any kind of metric without throwing the baby and all its siblings out with the bathwater.


> That's great, but then it says that in order to improve you need to declare a metric so you have something to measure, and then work to improve that.

You don't have to keep the metric the same. Metrics are good when they help you measure your progress on particular tasks. They are bad if you marry specific ones for life. If you worry about throwing out the baby, set up a counter-metric.


> The "fundamental gap" between developers is likely not there

Even at top tech companies I’d say 80% of the value was added by 20% of the staff.

I’d argue it is clearly present and far more pronounced than in sports.

The key difference, as you have alluded to, is that software development is not a winner-takes-all competition.


> Even at top tech companies I’d say 80% of the value was added by 20% of the staff.

That may be true, but it doesn't relate to my point as much as you might think. I believe the main reason most people don't contribute in a given organization is that the organization creates little incentive to do so, so mostly people are running off their own motivation.

What I mean is, the people contributing most of the value are not doing so because they were born talented programmers or ground their noses like crazy; they mostly just actually care about and invest in the subject.


This seems like an article that would not have been on HN's front page a few years ago. Are people blindly upvoting articles based on the title? How can HN moderate incredibly low-quality articles like this one? This is a checklist of how to become a better anything, how to work more effectively on a team, and other workplace how-tos, without diving beneath the obvious surface at all.

Am I the only person seeing HN descend into terrible clickbait with flame-war comment sections? In this case, I really hope my observation is the outlier, and that people continue to find this site helpful and constructive.


This type of article has been about 5% of the front page for as long as I’ve been in the community (~4 years, lurker for ~3).

Yes, maybe it feels obvious. Yes, it doesn’t cite a whole lot of evidence. But it resonates with people, and being reminded of the obvious continues to be useful, if not glamorous.

So I’ll keep upvoting those that resonate with me!


As long as the resulting discussions are sufficiently deep, I don't mind if the subject under discussion starts off shallow.


> The well-known 10,000-hour principle, popularized by Malcolm Gladwell, illustrates an important lesson for developers. ...

It’s not often I’ll wholesale dismiss an article on HN because of one sentence, but this is one of those times. The fabulist Gladwell is not credible in general, and his 10,000 hour “rule” can no longer be thought to even remotely approach science, or even fact. The author lost all credibility with this sentence.


I think it's an interesting look at the spread of bad information. When the concept of "everyone is 10,000 hours from being an expert" first surfaced, it was spread far and wide. As time went on and more evidence came out that this wasn't accurate, the NOT of the concept wasn't as interesting, so it wasn't spread nearly as much.

Basically, I don't blame people for still thinking it's true or a good rule of thumb. It's usually just used to say "if you want to be good at something, you have to practice", which is true.


Honestly, this makes it worse for me. Practice makes perfect was not exactly a novel concept before Gladwell. The phrase itself has been popular since before the US Civil War:

https://books.google.com/ngrams/graph?content=practice+makes...

I mainly blame Gladwell for this, of course, but people repeating this are definitely getting side-eye from me.


> Practice makes perfect...

"Hmm, sounds too cliche"

> The well-known 10,000-hour principle, popularized by Malcolm Gladwell, illustrates an important lesson..

"Much better, I've used more words"


When this was discredited, was there alternative research on how people actually accrue skill, or was it just a dismissal of the claim with no alternative information provided? It's a shame if this kind of research is not being done, but obviously it's understandable that a catch-all "how much experience does anyone need to build any skill?" is not easily, if at all, answerable.


I only read that Gladwell overemphasized the specific number but that the overall idea is still true. The original researcher, Ericsson, was trying to emphasize that long durations of training using deliberate practice are key.

There's an episode of the Freakonomics podcast that describes the whole situation, and it has interviews with both Gladwell and Ericsson: http://freakonomics.com/podcast/peak/

> ERICSSON: Now, right. Gladwell basically thought that was kind of an interesting magical number and suggested that the key here is to reach that 10,000 hours. I think he’s really done something very important, helping people see the necessity of this extended training period before you reach high levels of performance. But I think there’s really nothing magical about 10,000 hours. Just the amount of experience performing may in fact have very limited chances to improve your performance. The key seems to be that deliberate practice, where you’re actually working on improving your own performance — that is the key process, and that’s what you need to try to maximize.


Here's the key thing that I think has made me a better developer - focus on being a better communicator. At some stage in your career you're going to work on something that would be impossible for a lone dev to write. At that point being someone who is happy to have a meeting, write a clear document, or answer the phone is going to make you a key person on your team. You'll be in the center of everything, hearing the facts and making the decisions. That's what senior devs do. They communicate.


I firmly believe that every developer should take a technical writing course, even if they are self taught. Only my high school typing class surpasses my university technical writing course with regard to the skills that I consciously apply to my job.


It never ceases to amaze me how many engineers I work with struggle to write a competent email. This is a vitally important skill when you're trying to communicate specifications for a project or bugs and changes to code and hardware designs.

I'm not perfect by any means. I've had my share of confusing interactions (particularly when explaining technical things to peers in meetings), but at least I can proofread with a critical eye and fix confusing grammar and explanations.


I have been collecting similar guides to improve on software engineering, as I am a recent graduate looking to excel in my career. Here are some other blogs I found useful:

https://recurse.henrystanley.com/post/better/

"Ask HN: What topics/subjects are worth learning for a new software engineer?": https://news.ycombinator.com/item?id=18000410


My favorite takeaway from this is that I'm an ANGELIC TROUBLEMAKER (because much/most code is demonstrably terrible, but I do have a fair amount of experience in greatly simplifying it). Other than that, I find several points in the article to be relevant only to these times.

I've been through several boom/bust cycles and right now we're living in a boom period where:

* Programming demands a high hourly rate

* There are many programmers on the market, tending towards making them interchangeable cogs

* Income inequality is high, leading to a wide discrepancy between wealth and technical literacy

After the coming web2.0/mobile bust in a few years, these will likely no longer be true. I think what will make a good programmer then is the ability to map use cases (design) to abstractions and then to business logic in as little code as possible (preferably none at all). We're going to go back to a time more like the 1980s, when nontechnical people were able to get Real Work done with spreadsheets and DBMSs like Microsoft Access.

I looked at Airtable a bit and it has some interesting ideas but is no panacea. There are other interesting attempts in that space, but almost everyone seems to be moving in the opposite direction, towards large volumes of highly imperative, object-oriented code that is prone to technical debt and requires a long-term support staff of technicians to maintain it. In other words, maintaining is currently more lucrative than architecting, but that wasn't always true, and in a few years it likely won't be true again.


I don't think this ever stopped. I still migrate companies' Access databases and Excel sheets to the cloud, and many of these 'apps' were built in the last decade. But eventually corporations outgrow these technologies when they bump up against the technologies' limits. It's just a lot cheaper to migrate to and maintain a web app than it was in the '80s, so they migrate earlier now than they used to.


May as well lead in with the truthful sentence: "For this article, we reached out to every kind of role except actual software developers."


I have three rules for better coding performance, for me.

- Ask Questions

- Do not Multitask. One problem at a time.

- Test control answers, i.e. if you want the result to be 10, also test for 9 and 11. Checking that wrong results are rejected creates a more stable solution (see the sketch below).

Speed has never been my strong suit, but when I deliver code to QA, you can bet that you will be testing a well thought out solution.
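
To make the third rule concrete, here's a minimal sketch (Python, pytest-style asserts; price_with_tax is a made-up function purely for illustration):

    # "Control answers": assert the expected result, and also assert
    # that near-miss values are rejected, so a vacuous test or sloppy
    # implementation can't sneak through.
    def price_with_tax(price, rate=0.25):
        return round(price * (1 + rate), 2)

    def test_expected_result():
        assert price_with_tax(8.00) == 10.00

    def test_control_results():
        assert price_with_tax(8.00) != 9.00
        assert price_with_tax(8.00) != 11.00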


Highlights of this article:

- 10,000 hours mastery: Another power-of-10 based bullshit stolen from Rocky I.

- Stacked ranking: Bullshit belief that gives you a 100% probability of ruining a company. It cost Microsoft 10 years to fix the damage it caused.

- Personality types: Horoscope-grade pseudoscientific bullshit used to justify bad management. See also: MBTI

- Hyperbolic decay: Often applied to impress people easily impressed with math, i.e.: bullshit

- Self-promotion

My thoughts and prayers to the 278+ people that upvoted this article.


The most valuable skills I've learned are empathy and a servant attitude. I cringe at some of the things I've said in the past with an inflated ego and poor attitude. I try to put relationships first and listen before problem solving.


Despite the weird formatting and general buzzwordiness I actually enjoyed this quite a bit.

Chapter 2 resonated the most: finding a meaningful metric and then measuring against it is something every engineer should do, not just in SW. Sadly, it's very common for teams to just bumble along hoping for the best.


Was hoping for more concrete insights. Things only developers with decades of experience would know.


Here's an insight from a developer with 15+ years of experience: be very, very careful who you take advice from. The so-called "best and brightest" in our industry have led us down the path we're on today. I have a supercomputer in my pocket compared to computing speeds in 2000, but it can't even smoothly scroll a webpage. Object-oriented programming alone has derailed progress in computing by at least 20 years.


Thank you for this. Can you clarify why OO derailed us? If that explanation would be a long story, haha, then what came before OO, so I can do my own research?


This video does a great job of explaining why modern programming in general has failed: https://www.youtube.com/watch?v=uZgbKrDEzAs

I think OOP specifically derailed programming because of how big it was, how fast it took over and how long it was considered to be the one true programming paradigm. When OOP hit, it hit hard and fast. It wasn't long before colleges were teaching OOP as The One True Way To Program. Every employer required knowledge of OOP before they would interview you. And I mean every employer, from startups to corporate enterprise. And once it took hold, it took people (and me!) 10-20 years to realize the OOP emperor had no clothes on.

I still don't understand why or how it got so big so quickly. All I can figure is that the software industry as a whole pays attention to the wrong people.


Because it led us down a very fulfilling path of inventing interesting abstractions to solve problems, which tickled our mental fancy but led to bloated, sub-optimal solutions.

Unfortunately, it worked ... but eventually the bloat caused projects to hit a wall of unmaintainability. (If it had not worked so well in the medium-term, we would have moved past it much sooner).

Another reason it derailed us is simply the level of buy-in it received from the entire industry. Everyone drank the Kool-Aid, so advancement in more powerful approaches -- such as functional programming, data structure-oriented programming, and even data-oriented programming -- was neglected.


This (very long) article explains why OO might not be as good an idea as initially believed.

http://www.smashcompany.com/technology/object-oriented-progr...


Another tidbit ... imagine how science might feel if string theory is shown via undeniable objective evidence to be completely wrong. "All that time was wasted...."

But the software world bought into OOP much more deeply than science has bought into string theory.


God, what a badly formatted article. Fonts, line spacing, paragraphs. Everything is wrong. It's unreadable...


Be warned! There is no correlation between being a great software engineer and having a rewarding career in the workplace. You need other skills to survive the politics in hiring and promoting as well[0].

0 - https://cis.org/North/Oracle-Sued-White-Males-Indians-Receiv...


Oracle isn't a good example. Newer companies pair career development with individual developer impact.


Unfortunately, you only need half of this. Just listen and understand what others say; don't go over it. Don't think you have it all right. Work hard and keep your head down, but look for the options.

There is a lot of stuff that says to go kill it, but no. To kill it you have to be exceptional, and usually people are not exceptional. The unfortunate part is that most people will think they are exceptional and fail.


This is an excellent general framework for upping your game in software development within the context of a decent sized software engineering organization. I've stumbled my way into much of this advice over the past 2-3 years.

One caveat is that much of the advice is predicated on having a supportive environment.

For example:

> Application development is a team sport. Period. Full stop.

What if you work in an environment where people primarily work on their own? I have done a lot of that in the past.

> For many, the best way to learn is to teach.

What if there aren't feasible opportunities to teach, or your management pushes a pace that makes teaching infeasible?

> Create side projects

There are managers that support the idea of doing side projects but not the execution.

I'm sure you can find more examples of suggestions that aren't feasible in your environment.

Here is my answer to these examples (I wasn't just complaining): if you don't have opportunities to fulfill the sections of this framework that you feel you are lacking in, do it anyway. If your manager tells you not to spend time learning and improving, ignore them. Do it anyway.

You might say, "that's fine for you, Mr. Internet Commenter, but I'm the one that has to take the risk." Well, that's true, but I've already done it. Here were my results:

Did exactly what my management asked me to do for two straight years. All reviews for two years were "Meets expectations" with a long list of things I should improve.

One year ago, I read a book with an abhorrent title but good information: "Stealing the Corner Office".

I stopped listening to what my manager told me to do. Instead, I picked up random side projects with little to no perceived business value for my team, spent 10x more time reading code, made commits in projects I shouldn't have been working on, and started spending a lot more time understanding the basic fundamentals of software engineering.

Have had "exceeds expectations" on every review since and glowing praise.

It's counter-intuitive but, don't give the people what they ask for, give them what they want!

Note: I got a bit carried away on this comment. I should have mentioned that in reality I spend 10-20% of my time working rogue, and the rest of my time doing what my boss tells me.


I’m pretty sure ignoring what my manager told me to do would have gotten me fired relatively quickly at every job I’ve ever had.


Haha, yeah, my comment isn't fully fleshed out. I think I got too excited. I should have added that I spend about 10% of my time working rogue, which is much different from what I wrote.


That is a little different. ;) Is that 10% in addition to the 100% you were spending before, or are you spending 90% of your time doing what you’re told and 10% on other things? I’m thinking here of stuff like Googlers referring to 20% time as 120% time, and such.


I know many people that take the 120% approach, but I take the 80/20 approach. I have a lot of kids and volunteer a lot outside of work, so I don't want to make time for 120.


If you don't mind me asking, what were the responses to you committing to projects without asking/authorization, and what visible change did you make with your better understanding of the fundamentals? Do you have a book suggestion for SE fundamentals?


Adding to projects without authorization is extremely popular all around, if the commit/PR is useful. My boss sees it as taking initiative, helping others, and broadening my skill set. Other teams see it as getting free work.

Even when my contributions suck, people think of it as a positive and often teach me the right way to do the code or explain why the change is not necessary.

As far as books go, I don't have much to recommend. For me, improving fundamentals was really about figuring out what I didn't understand and diving in to learn it. I prefer online materials over books, but one book I have read that I thought was great is 'The Pragmatic Programmer'. https://www.amazon.com/Pragmatic-Programmer-Journeyman-Maste...


Awesome, congrats on your success and thanks for the suggestion!


Thanks, it can take a little while for the fruit of this approach to show up. And it doesn’t work in all organizations but I think it’s a great idea for a lot of software engineers.


I don't think it's counter-intuitive at all! Here's what I'm interpreting from your comment:

* When you listened to expectations and met those expectations, you were reviewed as meeting expectations

* When you met expectations PLUS did additional stuff, such as "spending a lot more time understanding basic fundamentals of software engineering", you exceeded expectations.

Sounds perfectly reasonable to me! Especially since becoming better at SE fundamentals probably made you an objectively better programmer.


This is a great point, except that I was meeting expectations. I talked to my manager a lot about how I could exceed expectations and followed that advice, yet I was still told I met expectations.

I think it's often difficult for managers to describe how to exceed their expectations, it may be a natural human limitation.


Orthogonal content that inspired me to finally get serious about learning functional programming, and led me to Erlang: http://sijinjoseph.com/programmer-competency-matrix/

Sure, it's flawed, but isn't everything?


In my experience the main determinants of programmer productivity are innate thinking ability and motivation.



