
How Developers Stop Learning: Rise of the Expert Beginner (2013) - spdionis
http://www.daedtech.com/how-developers-stop-learning-rise-of-the-expert-beginner
======
eranation
If you work at an average company that hires average people and you are
well above average, you quickly risk becoming exactly that. You can be in
the top 5% of developers (I know it's subjective and controversial, yet
I'm sure we all have a clue what that means), and this will make it likely
for you to become that "local maximum", as everyone internally will see
you as a guru.

But at companies that employ the top 5% of the top 5%, you probably won't
even make the first round of phone screens.

Get out of your comfort zone and simply learn something new. Go wide, go
deep, or both; just realize that the moment you feel you are an expert on
something, it probably means you are just ready to start learning it for
real.

tl;dr Knowing more than your peers doesn't make you an expert. You are at
a local maximum, and you need to look at the big picture to avoid being
stuck there forever.

~~~
baldfat
> If you work in an average company who hires average people and you are well
> above average you quickly risk becoming exactly that.

I liked Stephen Covey's take on this in The 7 Habits much more: "Sharpen
the Saw." This article's approach and wording put all the blame on those
around you. It is never good to think you're above those around you and
that you are above average.

A) There's nothing wrong with average; half the people are below average.

B) Rating someone as top 5% is purely perception, unless you're in
something that can be measured easily (think a fast 100m dash, or the
author's bowling).

C) Having hired dozens of people for different jobs, I can tell you that
my top 5% at hiring usually ended up being drama, and the one person I
just hired because I needed a body ASAP ended up being hands down my best
employee. (Look at how Google has changed their hiring practices, because
hiring is close to impossible to get right; think NCAA brackets.)

D) Self-deception: everyone believes their kid is above average, and
self-delusion inflates our own self-assessment. Blaming those around you
gives you a great out from self-responsibility.

~~~
dsp1234
_A) Nothing wrong with average half the people are below average_

This isn't necessarily true. Imagine a system where we could rate
developers. There are 4 'A' developers (92, 95, 97, 91) and 2 'B'
developers (83, 85). Only 1/3 of the developers are 'below average'
(as opposed to below the median); flip those numbers around and 2/3 are
below average. This is kind of important when evaluating the type of
company culture that you're in. Is it a place where there are a lot of
'top 5%' developers and a short tail of good, but not great, developers?
Or is it a place where there are just one or two developers doing 'A'
work, but a long tail of developers doing solid, but not great, work? The
'average' could be misleading in these cases.
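The arithmetic checks out; here's a quick sketch (using the hypothetical
ratings from the example above):

```python
from statistics import mean, median

# Hypothetical ratings from the example: four 'A' devs, two 'B' devs.
ratings = [92, 95, 97, 91, 83, 85]

avg = mean(ratings)    # 90.5
mid = median(ratings)  # 91.5
below_avg = sum(1 for r in ratings if r < avg)

print(avg, mid, below_avg)  # 90.5 91.5 2 -> only 1/3 below the mean
```

Note the mean (90.5) sits below the median (91.5) here, which is exactly
why "below average" and "below median" can count different people.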

~~~
baldfat
Wow, I have a Theology/Philosophy degree, and I have seen what
Theology/Philosophy gymnastics can do to turn some weird basic fact of
life into a totally different thing. You might have missed your calling.

Average = average: if you have 100 developers, the 49 ranked lowest are
below average :P

~~~
ajuc
What is the average of [1, 1, 1, 1, 1, 1, 1, 2, 2, 10]? How many entries are
below the average?
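(For anyone who wants to compute it rather than eyeball it, the answer
takes two lines:)

```python
from statistics import mean

values = [1, 1, 1, 1, 1, 1, 1, 2, 2, 10]
avg = mean(values)                         # 2.1
below = sum(1 for v in values if v < avg)  # 9 of the 10 entries

print(avg, below)  # 2.1 9
```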

~~~
baldfat
No: we're dealing with 10,000 people.

We are talking rank, so the top person is ranked 10,000 and the bottom is
1. Ranks 10,000 to 5,000 are above average, and ranks 4,999 to 1 are
below average.

What you showed was some abstract numbers that have nothing to do with
ranking.

~~~
ajuc
Nobody said we were using ranking. And we shouldn't, because ranking is
misleading (it loses information by hiding the size of the gaps).

But even assuming it's a ranking: in a ranking there's the possibility of
equal ranks, right?

So you can have the rankings [1, 1, 1, 1, ..., 1, 10000], and the
"average = median" assumption is violated again.

Most real-world distributions have different average and median values;
"50% is below average" is a common misconception that should be cleared
up.

For one important example, see average salary: a huge majority of people
in (almost) every country earn less than the average salary in that
country.

~~~
baldfat
How is this NOT ranking? You are using the logical fallacy of reification
(turning something abstract into a concrete thing or object)
[https://en.wikipedia.org/wiki/Reification_%28fallacy%29](https://en.wikipedia.org/wiki/Reification_%28fallacy%29).
What _real-world example_ at all works with numbers like the ones you are
giving, "[1, 1, 1, 1, ..., 1, 10000]"? When you are talking about
specific people with specific skills, you can't go abstract. Average,
when talking about a population group's skills, is ranking, just like in
standardized testing. Yes, half the children in America are below
average, and the larger the number, the more this is true. Likewise, the
lower the number, the less this is true. We are talking tens of thousands
of people.

EDIT: Oh, I think you're looking at this like average household income,
with Bill Gates throwing off the average, possibly? Well, we are talking
about only one thing, and that is the skills of a programmer compared to
the skills of other programmers, just the same as the average math skills
of children. This is why we have percentiles, and my argument still holds
up. Skills are not like money, where one person has $60,000,000,000 and
another has $100.

~~~
ajuc
Call it the "median developer" and I'm OK with that. Average is a
different thing. The median developer is by definition a developer who is
better than or equal to 50% of developers.

> the turning of something abstract into a concrete thing or object

We've been talking abstractions from the start. The only abstraction all
my posts here rely on is the assumption that skill level is
one-dimensional, and you started using that assumption yourself (by
referring to the average developer).

Real world example:

There are 10,000 programmers. Many of the programmers who are just
starting to learn have similar skills to each other. Some of them have
already worked for a few years, and their skill levels start to diverge.
Many of them never develop their skills much (because they never need to
and aren't self-motivated enough). Some of the programmers got lucky,
work in places that allow/require them to keep learning, and become very,
very good. There are few of these, and their skills vary widely.

There are tons of people near the bottom and very few people near the
top. It's a natural consequence of the amount of work and focus it takes
to move from the bottom to the top.

Ranking them hides some of the differences, but not all (because it's
much more common for people to share the same rank near the low end of
the spectrum, where there are many more of them, than near the top, where
there are just a few people separated by big gaps). So when you took the
ranks of these people, you would get something like the list below
(1 = worst, 10000 = best; if your ranking works the other way, reverse it
and it still works the same):

[1 (repeated 20 times), 2 (repeated 15 times), 3 (repeated 10 times),
4 (repeated 5 times), ..., 100, ..., 10000] - so the distribution would
still be skewed, and the average would still differ from the median.
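A concrete instance of such a list (with the elided middle filled in
arbitrarily, purely for illustration) shows how far the mean and median
can drift apart:

```python
from statistics import mean, median

# Hypothetical ranks: heavy ties at the bottom, a few isolated top
# performers; the "..." parts are filled in arbitrarily.
ranks = [1]*20 + [2]*15 + [3]*10 + [4]*5 + [100, 10000]

print(round(mean(ranks), 1))  # 196.2 -- dragged up by one huge outlier
print(median(ranks))          # 2.0 -- where most of the mass sits
```

Almost every entry is below the mean, while by construction about half
are at or below the median.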

~~~
baldfat
"Average" in statistics covers the following: mean, median, mode, range,
and more. This pseudo-intelligent attack on "average" (not trying to be
mean, or saying you are trying to do this), made because people heard a
cool attack on the mean when it comes to income inequality, isn't
pertinent to speaking about above and below average.

Real-world example: it is still a non-measurable skill set.

> so the distribution would still be skewed and the average would still be
> different from the median value

You're also mixing up vocabulary: you're still taking average = mean, BUT
average also means median.

Here is a real-world ranking of skills (Math, Science, English, and
Reading):
[http://blog.prepscholar.com/act-percentiles-and-score-rankings](http://blog.prepscholar.com/act-percentiles-and-score-rankings)

Once again, what scores are you comparing? When I went to graduate school
I had to take the GRE and MAT tests, and I needed to be over the 70th
percentile just to be average in my field of study (Theology, which has
the highest scores of all fields of study, while Education had one of the
lowest, which makes me laugh). This had HUGE implications for my later
PhD program acceptance. I couldn't say I was above average in my field
just because I scored really high; I had to be better than well over 75%
of the people taking the test in my field of study. Just because there
are a ton of beginners who start and leave, that doesn't mean there isn't
a middle, and that middle ground will have skills well above the
learners. I counter the argument in this article discounting average
developers and placing the blame on their co-workers and environment.
Average workers are not to be looked down on.

~~~
ajuc
To even speak about any average you have to have a total order defined,
and also addition and division (if you mean the arithmetic one).

What is the average fruit from a set of 10 different apples and 3
different oranges?

It was your assumption that the skill levels of 2 developers can be
compared.

And it's not a useless theoretical distinction. In lay terms: most
programmers are worse than the average skill level of all programmers.

I even showed you a real-world example. IT is growing very fast, so most
people are beginners.

I'm not trying to look down on anybody. Being a better developer doesn't
mean you are a better person, and I'm not above average myself, IMO.

EDIT: I looked it up (I'm not a native English speaker), and you were
right about the math terminology. In Polish, average = mean = średnia.
Damn false friends. So if you meant average as median, I agree.

~~~
baldfat
No problem. I just get upset when people take a short quote and then
force it onto everything associated with it.

Do look at the link for the ACT test (for college admission); it kind of
shows how to make an average of skills and rank a population group by
them. In statistics you can do just about anything, and that can be
scary, but there is always a middle; it just depends on your decision
about the best tool to use.

Have a great day.

------
mcbrown
This (partly) explains a major paradox I see: all the best software engineers
I know are over 50, and yet the young'uns who dominate the tech zeitgeist
assume these same people are incompetent dinosaurs. The young'uns have reached
the expert beginner stage, and because they assume they're experts, they
refuse to seek out the actual experts to discover what it is they don't know.
And if the young'uns ARE the experts, and older developers aren't part of
their in-group, then by definition the older developers must not be experts.

This is why I (for one) am strongly biased towards hiring older developers.

~~~
Karunamon
_All the best software engineers I know are over 50, and yet the young 'uns
who dominate the tech zeitgeist assume these same people are incompetent
dinosaurs._

With age comes experience, and with it a strengthening of biases. Older
people, so the stereotype goes, are inflexible and set in their ways, if
_very_ experienced in those ways.

It's not really paradoxical, it just depends on what tradeoffs you're willing
to make as the person responsible for hiring, which depends on your line of
business. There are few things more tiresome than someone making mistakes you
already made two decades ago and tried to warn about, and the same applies to
someone ignoring the new and improved ways to get things done just because
they've always done it that way and know it the best.

~~~
maxxxxx
Actually I find a lot of the young guys pretty inflexible. They jump on the
cool thing of the day and then don't look left or right for alternatives.

~~~
nevir
Even worse, (generalization) they get frustrated and blame the team/project
when the tool-of-the-day doesn't work out.

~~~
maxxxxx
I have given this some more thought, and my theory is that most of the
currently young devs will probably turn into old developers who stopped
learning a long time ago. I guess that's the natural progression of
things. Most people stop learning at some point. It happened to people
who started 30 years ago, and it will happen to the current young
hotshots too.

~~~
beat
As an over-50 programmer who likes to learn, I'm gonna talk about something
totally unrelated to programming...

I'm a musician. I've been playing guitar since I was a teenager. As a
_technician_ , I've peaked out. I will probably never play faster, cleaner, or
more complex than I do now. If anything, I'll start to go downhill as age
takes its toll on my hands. But as a _musician_ , I'm always getting better.
I'm growing more conscious, more sensitive, more subtle, more sophisticated.
I'm a better musician now than I was a year ago, and a _far_ better musician
than I was five years ago.

There's a similar thing in software. When you're still learning, still
approaching real expertise, it's easy to think that being a great software
engineer is about technique. It's not. I know a bunch of over-50 programmers.
Sure, many are basically dead in the water, but many are not, and are
constantly improving. It's not because they learn a new language, or a new
framework. It's because they learn better taste. They learn more and more what
is and is not important, how long things will take, best use of resources,
translating requirements more effectively... these things will all make you a
better engineer than writing glorified Hello Worlds in framework-of-the-month
ever will.

------
dsmithatx
Technology moves so fast that I'm not sure it even pays to become the
actual expert anymore, most of the time. When I was younger I'd work 40
hours a week and spend at least 40 more becoming an expert. It paid off
and got me to the senior level. However, all the technology I was an
expert in 20, 15, and even less than 10 years ago is useless.

When hiring, I usually looked for guys who were smart and could quickly
learn new things, or jacks of all trades. Now, if I were trying to write
new kernel drivers, maybe I'd get a kernel/C expert who values himself at
120k/year (probably on contract, so I could use him for a month or two).
But let's face it: most of the time, companies would rather hire 3
smart-enough guys at 80k/year than 2 geniuses at one specific language or
technology. At least that has been my experience; then again, I usually
work at normal smaller/midsize companies and non-tech large companies.
I'm sure if you work at Google your experience would be very different.

The point is, I don't care about being the expert anymore, as it requires
a lot of my unpaid personal time. Then, after investing my time for free,
I find the tech trending down and have to go back to the drawing board
again.

~~~
sklogic
> Technology moves so fast

I've heard this many times before, but have not seen any evidence yet. Mind
listing the technologies that got really, totally obsolete?

~~~
sophacles
I think an even better question is: Mind listing the technologies that are
truly new?

So often I see the next great thing hype turn out to be a reinvention of an
existing concept, or a slight rearrangement of concepts that are positively
ancient. Sometimes this includes an incremental improvement over the past.

A lot of the velocity of "tech moves fast" is just backfilling tooling,
features, concepts, and edge cases into the new language or framework.

------
SeanDav
I have many years of experience with all aspects of Excel, including VBA,
and I understood Excel better than anyone else I knew. At one stage I had
the arrogance to think of myself as an expert in Excel. Fortunately, I
changed companies to work at a hedge fund and realized, with some shock,
that I was very far from an expert in Excel. Those guys were doing things
with Excel that made my jaw drop. I had become complacent and thought I
knew everything that was worth knowing.

The funny thing was that this was not the first time this had happened to me.
I underwent a very similar experience with C programming. I had read all the
books, thought I understood everything there was to know about the language,
then got exposed to some real C experts at a small high-tech company. That was
a humbling experience as well.

Needless to say, after getting humbled twice, I will be very careful before I
ever call myself an expert at anything in the future!

~~~
ZenoArrow
I'd be interested to know what sort of things you saw the hedge fund employees
use Excel for. I'm probably at that 'local maximum' point with Excel, and
would be interested in knowing what areas I could look into to further my
skills. The only 'growth area' I'm currently considering is Apps for Office.

~~~
daemin
You may be interested in
[http://www.modeloff.com/](http://www.modeloff.com/), which is a
competition for using Excel to create financial models. I was made aware
of it when listening to a Planet Money episode
[http://www.npr.org/sections/money/2015/02/25/389027988/episode-606-spreadsheets](http://www.npr.org/sections/money/2015/02/25/389027988/episode-606-spreadsheets),
which I would also recommend if you haven't listened to it already.

------
lliamander
I recommend reading the whole series of posts; I am only part way through, but
it has been very enlightening.[0]

Getting out of this trap is what motivated my first two job changes. I think
there are a few things that are useful in staying on the narrow, difficult
road to true expertise[1]:

\- Engagement with the wider technology community. You need to know where the
global maxima are

\- An insistence on objective measurement and argument, rather than subjective
assessment

\- A low tolerance for cognitive dissonance or sacrifices in quality to meet a
deadline

\- A willingness to follow through and see the fruits of one's labor

\- As a corollary to the previous point, an insistence on shortening the
feedback cycle as much as possible

\- Actively working to make it safe for others to criticize you and a
willingness to say "I don't know" (in case you didn't know, criticizing others
is genuinely very frightening for most people)

\- Greater graciousness towards subordinates, less towards superiors (the
reverse of typical corporate culture)

[0] [http://www.daedtech.com/tag/expert-beginner/](http://www.daedtech.com/tag/expert-beginner/)

[1] Not that I am an expert, just that I _know_ I'm not an expert; and that I
have met enough of both real experts and expert beginners to tell the
difference.

~~~
lliamander
Also, I have a hypothesis that Impostor Syndrome is the last defense of the
rational mind against the descent into Expert Beginnerism. It certainly saved
my bacon[0]. Listen to those doubts and seek out a mentor or a new job.

[0] In the sense of "How in the hell did the success of this project end up
resting in my hands?" or "Why am I making fundamental revisions of the
architecture?"

------
0xCMP
Published 3 years ago?? I thought he was commenting on what's going on
now...

I think a great example of my issue with this is my father and me. He
taught me the basics and got me to the advanced-beginner stage, which
allowed me to progress on my own. However, I chose web and he stayed with
.NET for a long time. Only after a lot of pushing did he start doing web,
like 5-6 years ago.

I, an expert beginner at the time (and probably still), thought I was way
better than my dad. How could this guy not be better than me already? No
answers to my questions, etc. In hindsight, yeah, I was better in places,
but his experience meant he had learned things I never did, because I
never knew I needed them. He also knew things from his several decades of
experience that let him think through problems, and he would only accept
a certain level of quality in his work, which let him solve and deliver
things much better than I can/could.

That said, we argued a lot about things. A father and son like us can
have those debates, sometimes a little too heated, and be okay. But if I
were hiring, I would be cautious (not opposed) about bringing on people
who would be too opposed to the way things are. The problem is that we
(humans) avoid conflict and begin to play politics, and if they don't
avoid the conflict then we have open hostility. Either way, groups start
forming, and a them/us mentality takes hold instead of the company
working as a whole. That's basically my general fear if I were in a
position to hire. That's what I feel "bad culture fit" would mean.

------
supergeek133
I see this a lot in large enterprise environments. You have the "one person"
or "few people" who implemented something 10 years ago and those architectural
decisions are then re-implemented in any new project.

Because that person has knowledge of the legacy system and carries it
over to the new one, they are compensated and promoted accordingly.
Meanwhile, any new ideas or better designs are sometimes dismissed
because they're outside the comfort zone of a development team that has a
collective 50+ years of experience at that company.

Of course, many of us are resistant to new ideas regardless, but I see this
pattern play out most often at large organizations.

~~~
brianwawok
Which is part of why startups can be successful, right? If large cos took the
same risks and did the same things as startups, startups would not exist.

~~~
supergeek133
Completely agree, startup success is as much quick thinking and process as it
is new ideas.

------
GCA10
The "expert beginner" concept almost gets it right, but it needs a little
tweaking. The key insight is that growth stalls out when people lock into a
sub-par set of habits ... and then spend years perfecting this not-so-good
approach.

I've seen the same thing happen with journalists (oh, my God, it is chronic
and horrible there), and with chess players, musicians and all sorts of other
specialists at various crafts and creative skills. These people trap
themselves with habits that prevent them from ever being great, yet are just
marketable enough that they can earn a living for a long time.

The rot doesn't always set in exactly at the "advanced beginner" stage. It
occurs whenever people's willingness to rebuild their game disappears, or
whenever they fall under the sway of a mentor/boss who demands conformity to
B-minus work practices.

------
tunichtgut
I like this article. I think I'm somewhere between the "competent" and
"proficient" levels, learning and working with C++ and CUDA/C. As long as
you don't consider yourself "an expert", you're good. Experts are really
rare; otherwise you are most likely an expert beginner stuck in the dead
end of the fork.

Two things I may add:

1.) I think the only way out of the expert-beginner dead end (besides
getting help from others) is through gaining more knowledge, instead of
gaining more experience.

2.) In job applications, there's a massive, I'd call it "invasion", of
"experts", just because everyone claims to be an "expert" so as not to be
sorted out by HR, and HR is most likely not a domain specialist in
programming. Now, given that you state in your CV that you are
"proficient" while an expert beginner states he's "an expert", you're
kicked out at the interview stage. What a mess!

------
moomin
Just a quick plug for the wonderful Expert Beginner Twitter account.
[https://twitter.com/ExpertBeginner1](https://twitter.com/ExpertBeginner1)

------
debacle
There is a time-salary curve where it's socially acceptable to be a beginner.
If your years of experience or salary are great enough, it takes a very
honest, humble, and rare person to say "Look, I see and hear everything you're
showing me. I clearly have no idea what I'm doing. Help."

The most important part of the article is this:

> ...and for whatever reason doesn’t have much interaction with peers?

At some point you need to be shown what you don't know, and it's very rare for
that to be a self-guided learning experience.

~~~
Mvandenbergh
It's a U-shaped curve, though: sufficiently experienced people actually
gain reputation by admitting a lack of expertise in a particular area. It
only really works, though, if you're already respected for something
else.

~~~
jdmichal
It's because they've stopped saying "I don't know" without following it
up with "but I'll find out and get back to you tomorrow", and then
actually doing it.

------
err4nt
If you want to break out of the 'Expert Beginner' mindset, here are some
helpful things that can challenge you to grow:

\- commit one _new_ mistake every day

\- read about a theory, concept, model, or idea you didn't know before

\- look up the history of one tool, language, or software package you rely on
and learn a little more about it

\- look up and read at least one page of the documentation for a tool you
use

Making a daily habit of some or all of these tasks will force you out of your
plateau. There's no telling in which direction your learning will grow - but
it's hard to stay stagnant in your skills when continually exposed to new
ideas. So make a habit out of exposing yourself to, and generating new ideas!

------
golergka
Can someone propose a test, or at least some heuristic, to check whether
you yourself haven't fallen into that pit? This seems like something that
would be very easy to diagnose in others but almost impossible to see in
yourself.

~~~
tunichtgut
I propose studying technical literature. Study advanced books at a
university library. If what is written inside is new to you AND you
consider yourself an expert, you are in the expert-beginner pit.

~~~
golergka
Well, I do study technical literature; just today I finished reading the
lecture slides for Stanford's CS143. The problem is, the technical
literature that interests me deals with much harder problems than the
ones I get to solve on a day-to-day basis.

------
jkot
> _They’ve officially become Expert Beginners, and they’re ready to entrench
> themselves into some niche in an organization and collect a huge paycheck
> because no one around them, including them, realizes that they can do a lot
> better._

I think the article is a bit naive. Nerds love learning for its own sake,
but that is not usual. In what way could the Expert Beginner do better?
Collect an even bigger paycheck? Or do less work for the same salary?
Being an N-times-better developer does not usually bring N times the
rewards.

~~~
sqeaky
I have seen this happen several times. Some, not all, programmers and IT
professionals are nerds as you describe them. In most professional
positions I have held, maybe 1 in 50 people programmed outside of work,
and maybe 1 in 20 chose to learn beyond the minimum required to get by.

Startups seem to have more of these types, and big companies, Fortune-500
sized, seem to have almost none.

As for how learning betters you: you cannot know until you learn it. New
tech exists because someone tried to solve the problems of the past, and
sometimes it works. You can't know the benefits of object-oriented or
functional programming if you are still writing the same COBOL you did in
the past; or, for a more modern analogy, you will need to stand up more
Apache servers to get the same throughput if you have never learned about
nginx.

~~~
tunichtgut
"In most Professional positions I have held, maybe 1 in 50 people programmed
outside of work and maybe 1 in 20 chose to learn beyond the minimum required
to get by."

Double signed. Barely anyone uses their free time to learn more than is
needed at the workplace (which means those employees are stuck at the
"minimum knowledge level"). And this is not just true for IT.

------
girkyturkey
At jobs like the ones this article refers to, we need to remember that
you're never going to improve in life if you keep competing with people
who stink (no offense to those who are not well versed). You've got to
challenge yourself. If you don't, complacency sets in, and bad things
happen when you become complacent.

------
ourmandave
> I'm worse at what I do best / And for this gift I feel blessed

Kurt Cobain

