
Ask Stack: Should I learn a new programming language? - shawndumas
http://arstechnica.com/business/news/2012/03/ask-stack-should-i-learn-a-new-programming-language.ars
======
sbov
My problem with this question is always: what do they mean by "learning" a
language?

Does it mean you just learn the syntax? Does it mean you can write a simple
program that works? Does it mean you can write a simple program that works and
exhibits the idioms of the language? Does it mean you can write a medium-length
program without having to refer to the API docs? Does it mean you can write
a complicated, robust, production-quality system in the language? Does it mean
all your friends who have a perplexing problem in the language shoot you an
email because you know it inside and out?

All of these take differing amounts of time to accomplish, and all can fall
under "learning a language". I think I've learned Erlang at least 3 times now.
When you say you learned a language in a week or two, what does that mean?

------
wvenable
Unless you have a particular project in mind, I don't think you can just
"learn a new programming language". There has to be some context for it.

Perhaps better advice is to seek out a new project in an area you're not
familiar with and use the opportunity to find the best tool to solve that
problem, regardless of the technologies you already know. If you need to build
a server process, maybe that would be a good time to look into node.js. If you
need or want to design a web application, look into Ruby.

------
6ren
A programming language can partially influence the way you think (a form of
Whorfianism <http://en.wikipedia.org/wiki/Linguistic_relativity>), but a PL is
a poor substitute for actual thought.

Thinking in terms of any programming language limits the ways you can model
problems and solutions. Of course, learning ways of modeling expands your
repertoire. I would hesitate to say that "programming languages" are the
richest source of such models. Even within them, features like dynamic typing
and closures are less significant than logic/constraint programming - or GPGPU
languages.
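For readers unfamiliar with the constraint style mentioned above, here is a toy
sketch of the idea in Python - a naive brute-force "solver" (the `solve` helper
and its parameters are made up for illustration; real constraint engines are
far more sophisticated). The point is the modeling style: you state the
relations that must hold instead of the steps to compute an answer.

```python
# Constraint-style modeling: declare what must hold, let a search find it.
from itertools import product

def solve(variables, domain, constraints):
    """Naive finite-domain solver: try every assignment, keep the valid ones."""
    for values in product(domain, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(check(assignment) for check in constraints):
            yield assignment

# "x + y == 10 and x < y" stated as constraints, not as an algorithm.
solutions = list(solve(
    ["x", "y"],
    range(10),
    [lambda a: a["x"] + a["y"] == 10,
     lambda a: a["x"] < a["y"]],
))
# solutions: x/y pairs like {'x': 1, 'y': 9}, {'x': 2, 'y': 8}, ...
```

The imperative equivalent would spell out a search loop by hand; here the
constraints are data, which is the shift in thinking the parent comment is
pointing at.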

On a separate issue, we can also see programming languages as an instance of
exponentially improving technology (Kurzweil). I haven't seen a chart of the
productivity improvements of "programming languages" over time, but certainly
there's no contention over the dramatic improvements from raw machine code, to
assemblers, to compilers etc.

The mechanism is the same as for increasing returns in other technologies:
standing on the shoulders of giants and seeing further, especially new ways of
doing things (the meaning of "technology"). That is, we create the next
generation of programming languages using the previous generation (and
benefiting from its productivity). E.g. we wrote compilers with assemblers,
then with compilers, and then with specialized tools (yacc, lex, etc).

At the forefront of technology, it's not always obvious which ways will turn
out to be best until afterwards - it's an exploration of the new, which is
unknown by definition. Anyone who thinks they know the answers before they
start exploring misunderstands the nature of exploration. Before we have
definite facts, we have opinions and contention - which is all to the good,
if it causes people to look harder rather than argue blindly.

------
hsmyers
My first, admittedly smart-ass remark was "If you have to ask, then the answer
is 'No.'" But that doesn't treat the question as legitimate---which it is. For
me the answer has always been 'Yes', but perhaps not for the reasons found in
the remarks below the article. I'm more or less addicted to learning computer
languages. Part of that is plain curiosity---I just like to see what's on
offer. The rest is the certain knowledge that each new language gives me a new
point of view, and sometimes a new solution to a problem; insofar as language
shapes how things are viewed, adding new material is almost mandatory as far
as I can see. I think someone mentioned "one a year"---I don't know that I'd
set a time constraint on the idea, but I'm usually at work learning a new one
at any given moment. I also re-learn old friends by using 'new to me'
compilers: Borland->MSVC->GCC, that kind of thing. I also believe that
anything that keeps the mind active in my chosen field can't help but be a
good thing :)

------
JVIDEL
Personal choices aside (everyone has their own), from a professional
standpoint you have to consider that even a mildly advanced Ruby programmer
gets a bigger paycheck than an experienced PHP programmer does, for the simple
fact that there are fewer Ruby programmers around and Ruby is currently in
high demand.

So learning the hottest language can open doors for you, job-wise.

~~~
toyg
Exactly. There are some well-paying niches where a "polyglot programmer" is
required, e.g. C++/Python or Ruby/JavaScript; knowing only part of the
required combo will likely not get you the job, or will curtail your
remuneration.

A career-minded person could expand his core competencies in order to nail
those niches. Not everyone enjoys hacking Lisp in their spare time, but
expanding your horizons is the right thing to do if you care about your job
prospects. If you write for the embedded market at a low level, pick up
something higher in the stack; if you only dabble in high-level languages, try
looking under the hood; if you specialize in particular technologies (e.g.
C#/.NET), look at their complementary stuff (e.g. JavaScript, PowerShell, or
even Java interop and VC++).

Also, "learning a language" could just mean learning a big framework or
toolset. For example, if you know Python, you might want to pick up Django, or
a GUI framework like Qt, even if they are outside the boundaries of what you
need at work: they tend to have their own programming style and quirks, so
they can be key technologies that bring you new opportunities. Java has
boatloads of big frameworks; chances are you don't know all of them. Ruby
people tend to live all the time in their Rails ghetto, but there's a world of
libraries outside it. And so on.

------
shingen
I like how Python is considered a 'next new thing'. It merely predates the
web.

~~~
tomjen3
Well, no. There may have been a language called Python back then, but it has
grown and been added to over the years (OO, faster implementations, more
libraries), which suggests to me that the incarnation of Python which predates
the web is not the incarnation that became popular.

------
ef4
If you asked me this question, I would assume you are a mediocre programmer.
The good ones don't consider it a big investment to learn a new language,
because they can do it really fast and because they enjoy it.

~~~
tkahn6
Since when did learning a language stop being a big investment? Beyond syntax
there are idioms, idiosyncrasies, standard libraries, advanced features,
upstream features, and toolchains.

How many people do you know who have a full understanding of C++ or Haskell?
Or even Ruby or Python?

~~~
boyter
Don't forget debugging and reading error messages. I honestly believe that 90%
of learning any language is learning how to resolve the errors the compiler or
runtime spits out at you.

