

What you know matters more than what you do - ridruejo
http://calnewport.com/blog/2012/06/12/what-you-know-matters-more-than-what-you-do/

======
rweba
I am also a computer science assistant professor, like Cal Newport, so I can
relate to what he is trying to say here.

An academic paper in computer science usually consists of applying a well-
defined "technique" to a well-defined "problem." For example, applying
"Support Vector Machines" to "Text Classification," applying "Expert Systems"
to "Medical Diagnosis," or applying "Particle Filters" to "Robot Localization."

The crux of what Newport is referring to is that in practice, many (most?)
academic researchers don't deeply learn new techniques outside of their
immediate research agenda after leaving grad school. They will certainly be
AWARE of new techniques and might learn their high-level ideas, but they
won't really learn them deeply enough to be able to improve them or use them
in a non-trivial way. It is much, much easier to continue exploiting and
building upon the techniques they have already mastered than to invest several
painful months (years?) to master completely new techniques.

So what Newport is suggesting is that you should stop applying your current
methods that are easy and very productive for you, and make a substantial
investment of time and energy to master the latest techniques - _even if you
don't exactly know how you're going to apply them_.

A programming analogy: if you're a C++ programmer, stop doing C++ projects and
spend at least 3 months becoming an expert in Python. If you're a Python
programmer, stop writing Python and invest several months in becoming an
expert in Go. (The analogy is not perfect because mastering a new language is
probably a bit easier and more fun than the kind of things Newport is talking
about.)

Here are a couple of thoughts I had on this:

(1) How applicable is this idea outside of academic research? For example, in
academia there is a big reward for being the FIRST person to apply a given
technique to a certain problem, but outside of research being the 2nd or 3rd
person to do something can be just fine. (See: Friendster, MySpace, Facebook).
So maybe you can afford to wait until someone has shown a great application of
a new technique and only then jump in and try to exploit it.

(2) An opposing but also convincing idea is that it is better to focus your
efforts in one area to avoid spreading yourself too thin. Such focus allows
you to gain a "comparative advantage" and to easily do things that are
difficult for people who don't have your deep experience. In other words, it
is better to spend your efforts trying to become the world's greatest Python
hacker than to jump on the bandwagon of every new programming language that
comes out and end up as a "Jack of all trades, master of none."

My conclusion: I do think it is worthwhile to challenge yourself not to just
stick to what you already know (which in my personal experience is VERY easy
to do especially if you find that you're very productive using what you know).
But you also have to be selective. Life is too short to try to be a master of
everything. And there is great value in gaining a very deep expertise in a
particular topic or technique.

~~~
loxs
I can confirm that this works for a programmer. I didn't even know that I
functioned like this before reading the article. I will go even further and
state that it works for everything.

And it's very often better to be a "master" of several trades, than a "grand-
master" of one. Of course, if you are _the_ grand-master of a trade, things
are completely different, as we all know.

I was once a medical student. But before that, I attended not a high school
specializing in natural sciences but an "English Language School" (I'm from
Bulgaria, Eastern Europe). So knowing English made me a better medical
student. One day I discovered the general-purpose computer, and I became
obsessed with it. Being good at computers made me even more outstanding among
my colleagues. I learned Linux, server administration, and network
administration, and I was a god.

I decided to switch careers (computers are so much fun), and I became a
programmer. Lacking the education, I started at the lowest possible position -
a Junior Visual Basic programmer at a company close to the end of the world
:). And guess what, I was WAY better than my colleagues. I had almost no
experience, but I was already a scientist. You won't believe me, but there are
so many programmers out there who are not scientists. And I knew Linux, I knew
networking. I was learning Python in my spare time, and I started discovering
problems that were easy to solve with Python. I was already a god at that
company from day one. Not to mention that I was already learning "web": HTML,
CSS, JavaScript, web servers, reverse proxies, "http accelerators", databases,
etc.

I was very quick to land a Python web programming job after a few months of
"experience". And guess what? I was already learning "NoSQL" databases in my
spare time. I knew how to solve problems with CouchDB, and we started
implementing stuff in CouchDB. I also became obsessed with XMPP and started
using it to solve more problems. That's how I found Erlang (ejabberd).

I learned Erlang, and I was now able to solve even more interesting problems.
The new job offer followed...

Now I write Erlang services (and I utilize XMPP), and I find new problems to
solve with my new tools - Redis, Riak, riak_core. And I am exploring more
tools. I wonder what will be the next big thing. Haskell? Ocaml? Lisp? C?
(yes, I have yet to utilize the power of C). But I am certain it will follow.
I can't help being myself.

During all this time (roughly the 5-6 years I've been programming), I never
really became a "grand-master", but knowing all these tools is really an
experience of its own.

The single tool that I left behind is Visual Basic (I never really liked it
anyway). All the others are of great use to me to this day. Even Medicine - it
was the best one at teaching me how to deal with complex systems.

~~~
jzilla
Could you elaborate on what you mean when you say you are a scientist and that
there are many programmers who are not scientists?

~~~
npsimons
_Could you elaborate on what you mean when you say you are a scientist and
that there are many programmers who are not scientists?_

I presume he's talking about people who probably shouldn't be programming
(those who routinely write code the likes of which shows up on
thedailywtf.com; you know, stuff that assigns variables multiple times in a
row). Apart from that, it's also telling that even most "computer scientists"
don't know much outside of programming, even in the domains they work in. I
count myself lucky that I get to work with people who have PhDs in basic
sciences on a daily basis; just keeping up in work discussions requires
delving into some really interesting science papers and books. Granted, I
couldn't write you Dijkstra's algorithm right away, but that's why I keep the
CLR book around.
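
For reference, here is a minimal from-memory sketch of the algorithm in
question - Dijkstra's single-source shortest paths with a binary heap, the
version the CLR book covers. The toy graph is made up for illustration:

```python
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]} with non-negative weights.
    Returns {node: shortest distance from source} for reachable nodes."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)],
     "b": [("c", 2), ("d", 5)],
     "c": [("d", 1)],
     "d": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3, 'd': 4}
```

The "lazy deletion" of stale heap entries sidesteps the decrease-key
operation that the textbook presentation assumes.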

~~~
loxs
Correct. I am talking about people who are not very good at logic and
scientific reasoning. People who know how things "are" (because someone taught
them so at university) and don't care _why_. People who blindly follow "design
patterns" and use their "hammer" regardless of the screw they are trying to
deal with.

People who say "I only learned Pascal at school. Let's use that for the
problem at hand" (No offence to Pascal and people using it).

People who know the Java standard libraries by heart (and have an excellent
diploma because of it), but can't produce a simple working program because
they lack understanding of the nature of computing.

The above are real examples of people I have worked with.

------
SatvikBeri
I think this point is definitely true for an academic environment, which Cal
is in.

For the average workplace, the ultimate determinant of your success is what
other people believe you have done and can do. This is certainly correlated
with what you've done, what you can do, and your ability to promote yourself.

In my personal experience, what you choose to do is more important than what
you know. I've mentioned a few times that I saved previous employer X
$2MM/year, and I did this by finding several problems that seemed possible to
automate, then learning how to automate them - not the other way around.

~~~
roc
In my experience, "learning how to automate them" is the step analogous to
the "technique"-learning discussed in the article.

e.g. To automate a health care provider's intake and records system well, you
really do have to learn a surprising amount of how those people currently do
their job. If you understand the job of the intake nurse, the needs of the
physicians' assistants, the way records are processed for insurance reasons,
the way doctors search and browse, etc., you can save vastly more time and
money (to say nothing of improving accuracy and reducing redundancy) than you
would with a straightforward records digitization process.

------
mgrouchy
Interesting that the title of the blog post is "What you know matters more
than what you do". I would say this thesis is incomplete. The author talks
about how he and his colleagues were trying to figure out the secret sauce of
a high-achieving researcher.

FTA:

 _I hypothesize two things. First, ultra-learning is difficult but it can be
cultivated. Second, it might be one of the most important skills for
consistently generating impact. Those who are able and willing to continually
master hard new knowledge and techniques are playing on a different field than
those who are wary of anything that can’t be picked up from a blog post_

I would possibly agree with his hypothesis at the end of the post, but not the
title. I think it's likely that this person is a high achiever not just
because he learns hard topics quickly and thoroughly; he is also applying the
requisite work and determination required to be a success.

So IMO what you do has to be at least as important as the things you know,
because that is the only means we have of judging your success.

------
bicknergseng
I would tweak his hypothesis:

1. It is now easier than ever to achieve "ultra-learning" because the vast
majority of knowledge is available everywhere for free online. The "learn to
code the hard way" crowd will disagree, but I would say working knowledge of
almost anything can be gained in between a week and a month, depending on how
specific the topic is (e.g. creating websites: a week; creating a kernel: a
month). You will NOT be an expert at either, but you will be someone who can
imitate. Imitation and application of existing knowledge are extremely quick
once you learn how to learn. I would say the researcher from the article and
Steve Jobs are masters of the "gain working knowledge and apply it" school,
but the low barrier to entry explains why every day we all read about the
"LinkedIn for this" or the "Instagram for that."

2. I would define "deep knowledge" as knowledge that doesn't already exist or
isn't already accessible. This is true innovation or discovery. This is the
internet, nuclear fission, DNA sequencing. This kind of knowledge is rare
because it takes time, persistence, and capability. One has to gain working
knowledge and keep digging until hitting the bottom of all knowledge about the
topic... and then break out a drill bit and push on, or go to the tallest
building in the world and add a story on top.

Unfortunately, our society rewards the first category far more than the
second.

~~~
loxs
See my other comments in the thread to find out about me.

1. This is true. A few weeks to a month is usually enough to learn enough to
be roughly aware of the kinds of problems you can solve with a tool. Finding
the right, interesting problem is not quite as easy, though. Usually, once I
know enough to have a feeling for the kinds of problems, I stop learning the
new tool; it's not interesting to learn more just for the sake of learning.
Once I find a good problem (or problems), I put months or years into learning
the tool and solving problems with it. One of the reasons I moved away from
medicine is that you can't learn while solving problems - you have to learn
_a lot_ upfront before you are allowed into the "field". This is really
justified, but I lost interest.

2. This is not always the case. There are fields (medicine, for example)
where true innovation is not always the result of "deep knowledge" - for
example, antibiotics, vaccines, etc. In those cases true innovation happened
because someone was trying to solve a problem (and quite accidentally), not
because someone with really deep knowledge predicted these phenomena.

I wouldn't try to judge which way is "better", and I wouldn't use the word
"unfortunately" here. We probably very much need both in order to progress.

------
consultutah
Just riffing on the title, which some below say is a horrible representation
of the article, but "who you know is more important than what you know."

As an old-timer in the IT industry, I'd simply like to share that the
connections you make with people are much more important than what you know
or what you do for your advancement within almost any company. Now, you
usually can't be totally incompetent, but the stronger your connections are
with people above and around you in your organization, the more allowance you
are given for mistakes, etc.

What am I really trying to say? Get to know people. Don't stay completely head
down in your cube knocking out the best code. Get up every once in a while and
build relationships with the people around you.

~~~
re_todd
And it's often easier than it sounds. I observed a few popular people and
decided to make mental notes of what they talked about, and I was surprised
that it was mostly trivial nonsense (who won the game, the car chase on the
news last night, etc.). It doesn't matter; people appreciate that you are
simply talking to them.

------
manmal
My take on productivity (it is about productivity after all, right?) is that
two things matter: a) what you choose to do, and b) how well you do it. The
second is affected by the domain knowledge you have. Most people don't leave
their domain for their whole working life (they might have secondary domains
because of hobbies) and should become experts no matter what (and therefore do
well on point b). Multi-domain geniuses like Goethe and Da Vinci (Jobs?
Feynman?) have to constantly learn hard to gain enough knowledge in many
domains to be successful there. Also, I am convinced that such people can do
better in a domain they have mastered because of the cross-domain analogies
they can apply.

Still, what you choose to do (point a) is just as important for productivity.
I can be a multi-domain genius scientist and still choose to spend all my time
recapitulating the achievements of other people instead of developing a new
physics model. Or I can spend years making one app completely bug-free instead
of writing 10 new ones.

UPDATE: One thing I am really wondering about is: how deep does the knowledge
within one domain reach for such personalities? Did Jobs know everything about
OSes, or just as much as he needed? My hypothesis is that they all learn just
as much as they need to achieve the goal at hand NOW (not hoarding knowledge
for the sake of it).

~~~
ArbitraryLimits
> Multi-domain geniuses like Goethe and Da Vinci (Jobs? Feynman?)

I'm sorry but I just can't let this go without comment.

Steve Jobs was not a "multi-domain genius." He had a single, very specific
genius: gathering lots and lots of ideas and picking the very best one. (IMO
the fact that he was an asshole is probably related but not essential.) His
entire career was applying this single ability, at which he really was a
genius, to multiple domains. He wasn't a genius at anything about those
domains per se, just at what he did once he was working in them. I mean, the
fact that he had any technical knowledge at all about operating systems or
computer graphics already put him in the top tier of "business guys," but
that's not the same thing as a deep understanding.

For that matter, let's keep going down your list. Richard Feynman was a smart
guy, but all his important work (except superfluidity in helium, I guess)
came from exploring a single theme: how to put quantum mechanics in a
Lagrangian formulation rather than a Hamiltonian one. He once told the story
of the event that sparked his first interest in physics as a career: his high
school physics teacher taking him aside and explaining the principle of least
action (the basis of the Lagrangian formulation of classical mechanics). He
basically organized the rest of his career around understanding this principle
very, very deeply. The reason he's famous, in addition to being smart, is that
he found a new domain to apply it to, namely quantum mechanics.

My point is that while mastering a domain of application was important for
these people, probably even crucial, it wasn't really the engine of their
success. They first developed a single talent or interest, and then looked for
a domain to apply it to.

~~~
Someone
_"IMO the fact that he was an asshole is probably related but not essential."_

I disagree. He didn't just gather ideas; he made sure they were good ones,
too. Doing that requires probing deep, and probing deep hurts. I think he also
worked on his efficiency by dismissing people who didn't have a high ratio of
good ideas, and he wasn't afraid to ditch a good or even an excellent idea for
a better one. That must have hurt those who had those good ideas.

Can one do that without coming across as an asshole? Probably, but not
efficiently.

That does not mean he was an all-out asshole, though. For example, from the
little I read, Jobs cared for his family. Because of that, I would call him
ruthless, not an asshole.

It is a bit like being a world-class athlete: they will do whatever is
necessary to improve their sporting ability. You simply cannot become the best
in the world by thinking "that is her only chance at the Olympics; let her go,
my chance will come", or without sacrificing something or someone in your
social life. Does that make them assholes? IMO: no.

------
gwern
> According to my colleagues, this star researcher tends to begin with
> techniques, not problems. He first masters a technique that seems promising
> (and when I say “master,” I mean it — he really goes deep in building his
> understanding). He then uses this new technique to seek out problems that
> were once hard but now yield easily. He’s restless in this quest, often
> mastering several new techniques each year.

The Feynman algorithm for looking like a genius!

------
billswift
The commenters here seem to have missed what I thought was Cal's point: What
you know matters more than what you do, because what you can do is limited by
what you know. Learning new techniques is important because it expands the
arena of what you can do.

~~~
randomdata
I think that is a given: Once you've done something, you intrinsically know
how to do it.

More interestingly, to what degree does our knowledge affect our ability to
see new ideas? Do our brains ignore the seemingly impossible because it seems
impossible? Could someone have imagined Facebook before computers and the
internet were invented, for example? Do the ideas come first, and then one
sets out to figure out how to make them work, as seems to happen in science
fiction? Or are those ideas already based on working knowledge of what is
reasonably possible with not-so-far-off technology?

------
jcfrei
Terrible title* (not chosen by OP, mind you) that almost put me off - but a
very good article. *It should be more along the lines of: what you can learn
matters more than what you can do.

~~~
nilaykumar
Yep, had the exact same reaction.

------
iuguy
For the workplace, I'd say what you do is infinitely more important than what
you know, although there is a clear relationship between the two. At the end
of the day, people can _qualify and measure_ what you do; trying to externally
measure what someone knows is pretty much impossible to the same extent.
Whether you're known as the guy that broke the build or the guy that automated
the build process, you're known by what you do, not what you know, you know?

