
Being confidently programming language agnostic - ergot
https://blog.bradfieldcs.com/in-2017-learn-every-language-59b11f68eee
======
cryptoz
Discussion a few weeks ago:
[https://news.ycombinator.com/item?id=13291593](https://news.ycombinator.com/item?id=13291593)

------
pfarnsworth
I've been programming for 20+ years, and recently moved over to Python. Sure,
I could code on day one and figure out how to get programs working pretty
easily. But the nuances with it are still things I need to work on a lot. I
still don't program Pythonically, I program like a C programmer writing Python.
In fact, I probably program in all languages like I would a C programmer, and
that's not good enough, in my opinion.

I have fallen in love with Python because it's so damn easy to get productive,
and I really want to be a great Python programmer. That takes a lot more time
than the OP suggests, and requires you to immerse yourself in the patterns of
the language and in the community, in my opinion. Not just dabble a bit and
then check a box saying "I'm a polyglot!"

~~~
pmontra
It takes a while to get used to the idiomatic constructs of a language.
Luckily it's much easier than with natural languages.

My first Ruby programs were very Java-like. I doubt my first Java programs
were C-like, though: once you use classes and methods they just can't be. No
problem with using Python and Ruby together. They are maybe like German and
English: close, but clearly distinguishable. And JavaScript, Perl, PHP,
Elixir... that would take too long to write about.

A consequence of multilingualism is that one starts noticing the differences
in the implementation of the same features in different languages. Some are
smooth, others are frustratingly hard to use or to remember. A quick test on a
trivial nuisance (no Googling allowed): in Python, is it array.join(",") or
",".join(array)? And "1,2,3".split(",") or ",".split("1,2,3")? I remember only
that the "," goes on opposite ends in the two expressions, and I can't
understand why that should be good design.

~~~
pvg
The first one is neither - it's string.join(iterable). If you keep that in
mind then the design becomes clear - it would have been weirder to force every
iterable to have some string-related method. Better to put it on string, where
it more reasonably belongs.

The second one is the typical OO pattern where you ask an object to perform
some operation on itself. This is how split works in just about every OO
language - if you have trouble remembering it, it might be helpful to remember
split can be called without any parameters - in that case your alternative
variant won't make any sense.
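
Both points can be checked quickly in plain Python (nothing here beyond the
standard string methods):

```python
# join lives on str and accepts any iterable of strings, not just lists.
print(",".join(["1", "2", "3"]))           # prints 1,2,3
print(",".join(str(n) for n in range(3)))  # prints 0,1,2 -- generators work too

# split is the usual "ask the object" OO method; the no-argument form
# splits on runs of whitespace.
print("1,2,3".split(","))  # prints ['1', '2', '3']
print("a b  c".split())    # prints ['a', 'b', 'c']
```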

~~~
pmontra
It makes more sense now, thanks. I missed that the argument is an iterable.
Ruby's join is a method of Array and of nothing else. Examples with ranges:

Ruby's range must be converted to Array.

    
    
        > (1..20).to_a.join(",")
        "1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20" 
    

Python's range must be converted to a list of strings.

    > ",".join([str(i) for i in range(1,21)])
    '1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20'
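
For what it's worth, the list comprehension isn't required -- join accepts any
iterable of strings, so map works too:

```python
# Same result without building an intermediate list.
print(",".join(map(str, range(1, 21))))
# prints 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20
```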

------
chrisfosterelli
I know formal education often gets a lot of criticism around HN, but I think
the approach the article is talking about is heavily mirrored in most
university computer science curriculums.

Universities tend to focus on paradigms and patterns, and typically require a
student to learn at least 3 languages over their education (many more if they
want to). Just in my undergrad I learned C, C++, C#, Objective-C, Swift,
assembler, Java, Go, JavaScript, Python, and Haskell. My personal experience
is that university grads are much better at adapting to new languages than
someone with 4 years of experience in only a single language.

That's of course not to say you can't do the exact same sort of education
without going to school (and probably in less time).

~~~
gravypod
I think the top-10 universities do it right and the rest don't. (This is a bit
of an exaggeration but you most likely know what I mean).

Most of my classmates don't usually "learn" a language. They do an assignment
by copying, hammering at the computer, or just asking for help. The way
university is set up is antithetical to learning and exploring new programming
languages, and it comes down to one simple thing that most universities do and
that is very easy to fix:

    Don't mandate a programming language for your assignments

If this one thing were done, we'd probably see an instantly higher failure
rate and a much higher quality of graduate. If early in the curriculum
students got a tour of every one of the big-name languages and then got to
choose the best tool for the job for the rest of their time at university,
it would be a much closer approximation to how things actually work (at least
for me).

If one of your projects is something like "scrape a webpage for X data", you
wouldn't want to write that in a bash script (which I've been made to do [0]);
I'd want to do it in Python with BeautifulSoup. Or if your project is to write
a parallel dot product function, you wouldn't want to write that in C (which
I've been made to do [1]); I'd want to write it in Julia.

Even in the class I took on exploring programming languages we were forced to
use C++. We were writing an interpreter, which I'd rather have done in some
Lisp-like language.

Unfortunately, I've not been able to make the design decision to play with
other languages (in school) to see how they would affect the development of
these applications. I've not had to prototype things to see which language
they'd work best in (in school). These decisions are made for me.

I've found that this isn't how things work, at least for me: I'm the one who
is told "do X" and I pick a way to do it, whether that's setting up a
spreadsheet with a macro in it that generates the data or writing some real
code. I get to choose the most elegant solution, and I suffer the consequences
when I didn't choose the most elegant solution, since I have to fix my
software when it breaks 2 years down the line.

[0] -
[https://web.njit.edu/~sohna/cs288/hw3.html](https://web.njit.edu/~sohna/cs288/hw3.html)
[1] -
[https://web.njit.edu/~sohna/cs288/hw10.html](https://web.njit.edu/~sohna/cs288/hw10.html)

~~~
CJefferson
There are three problems with "Don't mandate a programming language" -- I'm a
lecturer, and I have done this for advanced practicals in later years.

* Students expect to be able to get help when they have problems. There is a good chance no member of staff knows Julia / Moonscript / ...

* Some languages make tasks trivial -- while this is nice when you are in the real world, if I want to test students' ability to create something I don't want some students missing most of the work. How do I then mark it?

* Similarly, if the question is "implement a malloc-like memory manager", well, you really have to do that in C, C++, Objective-C, maybe Rust; it makes less sense in Python.

Also, getting a "quick tour" of (say) C++ isn't really useful; students who
try to pick it up by just googling are likely to write terrible code. Learning
a language properly takes work.

~~~
gravypod
> Students expect to be able to get help when they have problems. There is a
> good chance no member of staff knows Julia / Moonscript / ...

For this I have two answers. Past the first year, people shouldn't be getting
help with "my code won't compile". They should have developed the skills
needed to search for that on Google and find the Stack Overflow links.

The second answer is a question: why don't members of staff know "Julia /
Moonscript / ...", and if they don't, why can't they logically reason about
what's going on in the language without having used it? I don't know Go, but
when people have asked me to look at some Go code to see where a bug is, I can
still reason about what's happening. Isn't that what this article is about?
All languages are just mixes and matches of common idioms with new ways of
expressing them. If the best of the best, those who are teaching the future
generations of computer scientists, can't do this, that seems strange to me.

> Some languages make tasks trivial -- while this is nice when you are in the
> real world, if I want to test student's ability to create something I don't
> want some students missing most of the work. How do I then mark it?

If the student understands how to use the abstraction then they have likely
learned something far more valuable. If you're assigning labs that consist of
basic idioms that can be whisked away by common library functions then you
might consider changing your curriculum to focus more on solving problems
rather than codifying solutions.

> Similarly, if the question was "implement a malloc-like memory manager",
> well you really have to do that in C,C++,Objective-C, maybe Rust, but it
> makes less sense in python.

I see no reason why you'd have to write that in C/C++/Objective-C/Rust. If
you're going after the idea of writing a working memory manager, rather than
"write me a kernel that has a memory manager", then Python works great for it.
Here is an example:

    class Allocation:
        def __init__(self, start, size):
            self.start = start
            self.size = size

        @property
        def end(self):
            return self.start + self.size

    class FirstFit:
        def __init__(self, total_memory):
            self.allocations = []  # allocation table, kept sorted by start
            self.total_memory = total_memory

        def alloc(self, size):
            if not self.allocations:
                allocation = Allocation(0, size)
            else:
                allocation = None
                for i, current in enumerate(self.allocations):
                    if i + 1 < len(self.allocations):
                        # Can we fit between this allocation and the next?
                        gap = self.allocations[i + 1].start - current.end
                    else:
                        # Can we fit between the last allocation and the end of memory?
                        gap = self.total_memory - current.end
                    if gap >= size:
                        allocation = Allocation(current.end, size)
                        break
                if allocation is None:  # We can't, so return NULL
                    return None
            self.allocations.append(allocation)  # Store it in our allocation table
            self.allocations.sort(key=lambda a: a.start)
            return allocation.start

        def free(self, start):
            for i, allocation in enumerate(self.allocations):
                if allocation.start == start:  # Found our pointer
                    del self.allocations[i]  # Remove it from the table
                    return True  # We made it!
            return False  # We couldn't find this! PANIC!

This is crappy code, but it can be done very elegantly, and I think this gets
the theory across better than doing it in C. In Python you can also easily
experiment with far more complex data structures. (What if I think of memory
as a tree and halve my node's value every time its size is too big?)

> Also, getting a "quick tour" of (say) C++ isn't really useful, students who
> try to pick it up by just googling are likely to write terrible code.
> Learning a language properly takes work.

I'd say that's just because of the poor design of modern C++. You can do a
quick tour of Python and easily get the basics, of C and easily get the
basics, of Java and get the basics, of Common Lisp and get the basics. You
don't need to master a language to see where it is applicable.

~~~
CJefferson
> Why don't members of staff know "Julia / Moonscript / ..." and if they don't
> why can't they logically reason about what's going on in the language
> without having used it?

If the bug is a shallow/algorithmic bug, that's reasonable. Recently I was
helping someone debug some JavaScript code. Eventually, we found the problem
was that, in JavaScript, [11] < [2] is true. I'm not particularly
knowledgeable about JavaScript, so I didn't know it does < comparison of
arrays by converting them to strings and comparing those. That's the kind of
thing that really needs language-specific experience.

> If the student understands how to use the abstraction then they have likely
> learned something far more valuable. If you're assigning labs that consist
> of basic idioms that can be whisked away by common library functions then
> you might consider changing your curriculum to focus more on solving
> problems rather then codifying solutions.

This I just have to disagree with. I think it's valuable for students to learn
how to implement quick sort. It's useful to learn how to implement big-integer
arithmetic. It's useful to learn how you can "fake" Java-style inheritance
with structs and function pointers, so you really understand what's going on
under the hood.

Of course long term, you wouldn't typically implement these things yourself,
but understanding the fundamentals is important.
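
For instance, the quick sort exercise defended above fits in a few lines of
Python -- a teaching version (not in-place, first element as pivot), not
something you'd ship:

```python
def quicksort(xs):
    """Classic teaching quicksort: partition around a pivot, recurse."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 6, 1, 6, 2]))  # prints [1, 2, 3, 6, 6]
```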

Also, if I set a practical which involves (for example) connecting to an HTTP
server, intending them to do the raw connection themselves, and they use a
3-line Python program using the standard library, have they really learnt
anything at all?

There certainly is a place for giving students more freedom, particularly in
later years. It's clear the modern world is moving into "slap together 50
javascript/python packages with string" type programs (and that's because it's
a great way to be productive quickly), which universities don't currently
teach that well. But don't throw the baby out with the bathwater!

~~~
gravypod
> This I just have to disagree with. I think it's valuable for students to
> learn how to implement quick sort.

I'd rather the students know how to implement large software architectures,
keep line counts down, abstract problems correctly, and learn how things are
done in the real world.

> It's useful to learn how to implement big-integer arithmetic

Not in my opinion. Maybe, maybe, show them how emulating floating-point math
works, but writing big-integer arithmetic functions is pretty useless for most
people and is far too straightforward to develop their skills in designing
software architectures.

> It's useful to learn how you can "fake" Java-style inheritance with structs
> and function pointers

You can't force them to learn patterns; you can only give them work that is
better suited to using the patterns provided. You can even hint to your
students: "Hey, you can make a get_car and a get_bike, and make a Drivable
struct that has a Drivable->steer(), where steer is a function pointer!"
Forcing them to use a pattern isn't useful.
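
The hinted-at pattern, transliterated into Python with plain dicts standing in
for structs (all names here are invented for illustration):

```python
# A "struct" holding a function pointer; dispatch goes through the record,
# not through a class hierarchy -- roughly how C fakes dynamic dispatch.
def car_steer(vehicle):
    return "turning the steering wheel"

def bike_steer(vehicle):
    return "leaning into the curve"

def get_car():
    return {"steer": car_steer}

def get_bike():
    return {"steer": bike_steer}

for vehicle in (get_car(), get_bike()):
    print(vehicle["steer"](vehicle))  # calls through the stored "pointer"
```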

> so you really understand what's going on under the hood

Using function pointers isn't really how Java stores class/object
information; that's only used for virtual functions, IIRC. When I've
decompiled static bytecode you see things like LString(some function).

> Also, if I set a practical which involves (for example) connecting to a HTTP
> server, intending them to do the raw connection themselves, and they use a 3
> line python program, using the standard library, have they really learnt
> anything at all?

You're assigning the wrong problem. Don't say "make an HTTP request"; say
"implement an HTTP header parser". The problem is now language- and
abstraction-agnostic and involves a much more complex task whose complexity
lies in the realm of software organization. The HTTP parser code can be used
to read and write requests, and the next lab can be to use your new library in
a larger project. I think that is far more useful.
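
A rough sketch of what such an assignment might produce -- a deliberately
minimal parser (request line plus headers, no folding, no validation; the
function name and structure are my own invention):

```python
def parse_request_head(raw):
    """Parse an HTTP request head into method, path, version and headers."""
    head, _, _body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    method, path, version = lines[0].split(" ", 2)
    headers = {}
    for line in lines[1:]:
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return {"method": method, "path": path, "version": version,
            "headers": headers}

req = parse_request_head(
    "GET /index.html HTTP/1.1\r\nHost: example.com\r\nAccept: text/html\r\n\r\n")
print(req["method"], req["headers"]["host"])  # prints GET example.com
```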

> There certainly is a place for giving students more freedom, particularly in
> later years. It's clear the modern world is moving into "slap together 50
> javascript/python packages with string" type programs (and that's because
> it's a great way to be productive quickly), which universities don't
> currently teach that well. But don't throw the baby out with the bathwater!

There is a measurable reason for this, and it's not productivity. It's about
maintainability, consistency, and bug eradication. I'd love for you to read
the paper "Do Code Clones Matter?" [0] to see what they found.

Making it so people know how to do the following would be far more valuable:

    * List all the features that a library will need
    * Take those features and wrap them in a clean API
    * Do it in the most idiomatic language-specific way (e.g. Pythonic code)
    * Choose distribution methodologies
    * Maintain and support these libraries
    * Use these, and other, libraries in larger applications
    * Write documentation and do technical writing

I'd really love it if you could email me and follow up after you read that
paper and tell me what you think. My school username is jk369 and my school's
email server is @njit.edu. (I've split this up to avoid spam)

[0] -
[https://arxiv.org/pdf/1701.05472.pdf](https://arxiv.org/pdf/1701.05472.pdf)

~~~
CJefferson
I will look at your paper, but I think it depends on what your target is for
students.

Many of our students go on to do PhDs. For that, an understanding of deep
algorithmics is much more important than being able to use and distribute a
library, or to build larger applications. They need, in their field of study,
to be able to reimplement and understand many important, difficult algorithms,
not (for example) put together some node.js libraries.

However, there is a place for that kind of degree. Someone once said to me
something which stuck with me: "You wouldn't try to merge a maths, and
accounting degree, just because they both contain numbers", yet that's still
what we do in computer science.

~~~
gravypod
> I will look at your paper

I just want to make sure, and state for everyone, that I'm far too lazy to do
work that good. That's done by people much further along in the development of
their profession than I. It's not my work, just something that I think is
good.

> Many of our students go on to do PhDs. For that an understanding of deep
> algorithmics is much more important than being able to use and distribute a
> library, or building larger applications

It depends on what their PhD or major is. Many CS majors here go into math
and physics, and they would have done better to learn how to write Pythonic
(or MATLAB-ic?) code and how to correctly design APIs. For instance, tonight
I'm going to be rewriting a library from Python 2 to Python 3, and I have no
way to tell if there's been any regression, because of the lack of testing
frameworks, consistency in the APIs, and modularity. It'd be basically
impossible for me to mock even single parts of this.
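
The kind of regression check I'm wishing for looks something like this --
normalize here is a made-up stand-in for one small, mockable piece of such a
library:

```python
# If the library were split into small pure functions, a Python 2 -> 3 port
# could be checked piece by piece with tiny tests like this one.
def normalize(record):  # hypothetical library function
    return {key.lower(): value for key, value in record.items()}

# Regression check: behaviour pinned down before and after the port.
assert normalize({"Name": "Ada", "Born": 1815}) == {"name": "Ada", "born": 1815}
print("normalize behaves the same after the port")
```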

> They need, in their field of study, to be able to reimplement and
> understand many important difficult algorithms, not (for example) put
> together some node.js libraries.

I don't think left-pad is a good characterization of my position here. I
think the hardest part of large-scale software development is the architecture
portion, mainly because very few people ever actually start large, complex
projects on their own, and no universities offer courses in a field that
practices that.

> However, there is a place for that kind of degree. Someone once said to me
> something which stuck with me: "You wouldn't try to merge a maths, and
> accounting degree, just because they both contain numbers", yet that's still
> what we do in computer science.

The moment you, or one of your coworkers, creates a major like that (Software
Engineering and Development or something), I'll be transferring over from my
Computer Science degree. That degree is useless for anything but the name and
the prestige that gets my foot in the door for a lot of very fun and
interesting opportunities. I've had to cobble together everything I've needed
to learn from a software perspective: reverse engineering, bare-metal systems
development, assembly, networking, game development, operating system
development, web development, and many abstractions/patterns that go along
with all of those. It was very difficult, and I'd rather have given 30k/year
over 4 years to a school that could teach me the "real" way from
"professionals", and end up with a piece of extremely expensive framed paper
with my name and "degree" on it.

I'd also like for a school to hold my hand while exploring different
paradigms.

------
SimonPStevens
I used to arrogantly think I could be productive in any language given a week
or so to adapt. I often used the phrase "a good dev is a good dev in any
language." That belief was rather abruptly broken when I joined a project
using C++/CX; it was so far outside my previous experience that I had a really
bad time of it.

I happily picked up the actual language very quickly. It was the surrounding
ecosystem of compilers, build tools, debugging tools, libraries, standard
patterns and best practices that was too deep for me to become proficient in.

That said. Learning every language you can is definitely beneficial. Just
don't expect to hit the same level of productivity in all of them.

~~~
imdsm
I agree. It's not always the language which makes it tough to integrate
yourself into a project, sometimes it's all the frameworks, libraries, build
tools, and general makeup of the project itself.

I worked for a company a few years back doing ASP.NET (my experience is mostly
Linux based tech, but you can pick up a language, right?). The language was
fine, the design patterns used all quite standard, but the project had an
array of frameworks and libraries which made maintaining it painstaking.

In fact, I think you worked on it too.

~~~
jghn
I still find the "frameworks and libraries" argument to be false, but enough
people say it that I believe it must be at least partially true. I'll settle
for a belief that the argument is overstated.

IMO the largest issue is the breadth of one's experience, or lack thereof. And
by that I don't mean simply the number of languages but how well versed one is
in _different_ languages and their surrounding ecosystems. For instance having
good knowledge of both C# and Java isn't what I'm describing here.

~~~
SimonPStevens
I think it's about how much newness you have to handle at the same time. I can
happily take a project using a framework or two I've never touched before. Or
something in a new language but similar domain. But when you have the perfect
storm of new language, tools and framework (and maybe OS) that's when it
hurts. I'm not saying I totally couldn't switch, but it made me change my
attitude a bit, from '2 weeks and I'll be back at full productivity' to
something more like '2 to the power of the number of new things'. The more newness
that has to be handled at the same time, the longer it takes to pick up all
the interconnections.

~~~
jghn
This is also true, I would say it's a combo of the two. The more new things at
once the more likely it is that you'll be working with something without a
good correlation to something in your previous experience. But at the same
time having a broader and richer base of experience will make it more likely
that you'll have an easier go of it for any of those individual things.

------
achikin
I think nowadays people spend too much time learning new tools and too little
time doing something really valuable with those tools. I wish I had only one
language, so I could concentrate on more interesting things rather than
learning yet another random set of operators and library function names.

~~~
paulddraper
While using fewer, better languages for more is fantastic, I doubt you really
want a single language.

Try replacing shell, Structured Query Language, and C with the same language.
I won't say that it can't be done, but I think you'll lose a lot if you
succeed.

~~~
jononor
I'd hope that with embedded DSLs we can get closer. People are already doing
lots of this in Haskell, including in the domains you listed. Of course, there
is always the risk that one invents an "inner language" with poorer semantics
and tooling.

~~~
eru
Haskell is a pretty good host for DSLs. But if you want to go lower level than
Haskell, you have to essentially write compilers for your embedded DSL rather
than the usual interpreters.
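
The interpreter-vs-compiler distinction for embedded DSLs can be sketched even
in Python: the same embedded expression language gets both an interpreter
(eval) and a tiny "compiler" that emits source text. (This toy is my own
illustration, nothing Haskell-specific.)

```python
# A minimal embedded expression DSL with two back ends.
class Num:
    def __init__(self, v): self.v = v
    def eval(self, env): return self.v
    def compile(self): return str(self.v)

class Var:
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]
    def compile(self): return self.name

class Add:
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) + self.b.eval(env)
    def compile(self): return "(%s + %s)" % (self.a.compile(), self.b.compile())

expr = Add(Var("x"), Num(2))
print(expr.eval({"x": 40}))  # interpret: prints 42
print(expr.compile())        # emit code: prints (x + 2)
```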

And of course, Haskell's type system is not endlessly flexible (yet..). Eg
Haskell still struggles expressing relational programming or linear types /
uniqueness.

~~~
jononor
Yes, the low-level DSLs tend to become their own compilers. But the good
thing is that, as a side effect, they also have an API, so they can hopefully
be reused for new DSLs. Interoperability between different DSLs does not
necessarily follow, though, unfortunately...

~~~
eru
Yes. And of course, you still need to write a decent compiler to produce
decent code.

The situation is similar to Lisp macros: yes, you can implement Prolog in
Common Lisp in a few lines, but no, it won't be a fully featured and fast
production system, unless you actually put in the work. (Paul Graham's 'On
Lisp' makes these excellent points in the chapter on the Prolog interpreter.)

Of course, you might want to go all the way to dependent typing. I think one
of Isabelle or Idris actually compiles to 'low-level' languages like Haskell
by default?

The main benefit I would like to see in Haskell is totality / termination of
programs by default, with Turing completeness hidden behind something like
unsafePerformCompute. Similarly, we could split IO into IOReadWrite and
IOReadOnly.

The former would be the same as the old IO; the latter's actions could depend
on the environment but wouldn't be allowed to influence it (or, weaker, would
at least require idempotence?), thus allowing more scope for optimization and
human understanding when reading code.

------
RivieraKid
After you have some experience with, say, 3 to 5 different languages, you can
learn new ones very quickly (hours, days) with little effort because you know
almost all of the core concepts (OOP, FP, mutability, generics, ...). There's
no point in learning "all of the languages" for its own sake. Just learn a
bunch of different ones to learn different ideas and programming styles.

What takes more time is the standard library, common libraries, tools. A good
example of that is iOS development in Swift. Learning the language is almost
negligible compared to learning all of the technologies and tools. But coming
from Android, I think I'm at about 50% of full productivity after two weeks.

I never understood why there's so much focus on the "x years experience in y"
in the industry. A solid developer should be able to become fluent in any
language and technology within a month or two.

~~~
protomok
"I never understood why there's so much focus on the "x years experience in y"
in the industry. A solid developer should be able to become fluent in any
language and technology within a month or two."

I don't think any developer can become fluent in a new language in 1-2 months.
I have seen so much terrible code over the years that compiles and appears to
run just fine, but with massive performance issues, poor design, unreadable
code, security issues, memory leaks, etc.

I see learning the syntax of a language as something like 0-5% of the effort
of mastering it; knowing how to properly use a language really does take
years. Although I agree that the number of years of experience is not a
perfect metric.

------
dvcrn
I'm glad that the trend is slowly going towards hiring generally smart people
instead of a "PHP/Rails/Python programmer". I'd say it's better to hire people
that can easily adapt to something new when needed than people who say "I am a
PHP programmer, I don't want to do this new project in Ruby." (I met my fair
share of people at previous companies who thought that way, to the extent that
they would quit if asked to write in something other than what they were hired
for.)

In my current company they were looking for a Go programmer. I had never
written a single line of Go before, yet they hired me and now I am writing Go.
Then we had a deficit on the iOS team, so I learned Swift and now work on our
app.

------
salex89
I have a sort of an issue with learning a lot of new languages. As much as I
like to try them out, I like (and have) to learn things like big data
processing or devops stuff and distributed systems and so on. And to be
honest, when I get into the theory of those fields, the language does not
mean much. In the end, I end up using whatever the target industry is looking
for, and they do not need languages, they need solutions. And the language is
rarely the solution.

In that set of things, a new language sits somewhere lower on the priority
list of things to learn. Sorry.

------
sriku
A few recommendations on this, since a talk on this very topic is being
organized today where I work.

1\. A great book that covers multiple paradigms of programming is Van Roy and
Haridi's "Concepts, Techniques and Models of Computer Programming"
([https://mitpress.mit.edu/books/concepts-techniques-and-models-computer-programming](https://mitpress.mit.edu/books/concepts-techniques-and-models-computer-programming)).
This stands hand in hand with the better-known SICP.

2\. Folks in the CS and programming world seem to ignore the bleeding-edge
work being done in the arts space. To get a broader view of languages than
"characters that go into a plain text file", expose yourself to the live-ness
of the following:

2.ø Smalltalk - one of the first fully live language-and-runtime systems, and
still usable today.

2.a Max/MSP/Jitter - by David Zicarelli and Miller Puckette - a visual
dataflow programming language with decades of dominance in the computer music
scene.

2.b SuperCollider - for architecture lessons as well as another multi-paradigm
language.

2.c Impromptu - a Scheme-based live coding environment for music and visuals
by Andrew Sorensen. Normal REPLs will bow in front of most "live coding"
languages used for music.

2.d ixi lang by Thor Magnusson - another live coding language, where the
language is in a sense inseparable from its runtime environment. The current
running behaviour of a textual program can even depend on how the program
evolved.

In short, break out of normal modes of thinking and attain Turing nature, at
which point you can proclaim that all languages have Turing nature and yet
retain your discriminating view.

~~~
tjl
A few other dataflow programming languages: LabVIEW, which has been around for
a very long time and is extremely popular for data acquisition (it's basically
the go-to choice most of the time), and the Houdini computer graphics software
(there are other examples in CG as well). I pick Houdini because there's a
version people can download and try. It actually works pretty well for most
things.

------
aryehof
I feel we are starting to lose the plot. The problem isn't about learning
another language or which to use next. Instead it is about how to solve and
represent complex problems and systems in code.

Any language is a means to an end, not the end itself.

Focusing on languages and language constructs is of value to those in academia
and those working solely in the domains of computing and computer science. For
the rest, it is the equivalent of navel gazing, the equivalent of focusing on
grammar, when the task is authoring a novel.

~~~
dwaltrip
I agree whole-heartedly. However, as we take on projects of larger scope and
complexity, reflecting on the craft in a measured and skeptical way can be
powerful and productive. It's a tricky line to walk.

------
general_ai
Languages are easy, libraries are hard.

~~~
jedimastert
This seems to be the message from most of the comments

~~~
general_ai
I'd be surprised if it wasn't. This is immediately obvious to anyone who's
tried to solve practical problems in languages, rather than just learned them
for resume embellishment purposes. That's also why sane companies limit the
languages that can be used for development: if you let your "theoreticians"
run wild, you will soon end up with code written in Brainfuck and Malbolge
using libraries no one understands. I'm exaggerating, obviously, but only
slightly.

------
protomok
Interesting article, love the emphasis on continual learning.

But I actually think we (software devs) need to focus more on mastering
languages as opposed to learning many languages at a surface level.

Books like "Effective C++" ([https://www.amazon.ca/Effective-Specific-Improve-
Programs-De...](https://www.amazon.ca/Effective-Specific-Improve-Programs-
Designs/dp/0321334876)) really showed me the huge divide between knowing a
language and mastering a language.

------
zzzcpan
I think Norvig and the author are wrong about the "parallelism" requirement.
I mean, it's good to know what's out there, but you can't really understand
it if you try to learn it from a multithreading point of view. The
fundamentals are part of distributed systems, and that's where people should
go to learn about ordering, consensus, asynchronous and synchronous systems,
etc.

~~~
eru
There's of course also the distinction between parallel and concurrent.

~~~
zzzcpan
Yeah, they are mixing all kinds of things together, Go with Erlang and even
throwing a GPU in there. These are all very different worlds and none of them
teach you fundamentals. I remember how useless the parallel programming class
was, I only started to understand the underlying ideas behind semaphores,
mutexes and memory barriers long after I got into distributed systems.

------
ar15saveslives
It's far from enough to know a programming language. If someone knows how to
write binary search in C++, they can rewrite it in Python, JS or C# pretty
easily.

But software development is not about writing pure functions, it's about
writing applications, so frameworks and good practices are more important
than the language.
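The portability claim above is easy to make concrete. Here is a minimal
sketch of binary search as a pure function in Python; the same loop ports
almost line for line to C++, JS, or C#, which is exactly why knowing it in
one language says little about knowing the other language's ecosystem.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # no overflow concern in Python, unlike C++
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1           # target must be in the upper half
        else:
            hi = mid - 1           # target must be in the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3
```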

------
didibus
What you should really do is learn the concepts of structured programming.
Every language is just a notation on top of a fundamental programming
construct. If you learn those instead, you'll know every language.

I recommend reading Exercises in Programming Style by Cristina Lopes as a good
starting point.

------
known
In C/C++, the CPU is managed by the kernel, and memory is managed by you;

In Perl/PHP/Python/Java, the CPU is managed by the kernel, and memory is
managed by the runtime;
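A small sketch of that contrast from the managed side (CPython-specific
behaviour, noted as an assumption): where C/C++ would require an explicit
malloc/free pair, the Python runtime tracks references and reclaims objects
itself.

```python
import sys

# In C/C++ you would malloc() this buffer and later free() it yourself.
# In CPython the runtime owns memory: objects are reference-counted and
# reclaimed automatically once the count drops to zero.
data = [0] * 1000
alias = data                   # a second reference to the same list
print(sys.getrefcount(data))   # typically 3: data, alias, and the call's argument
del alias                      # drop one reference; no free() needed
print(sys.getrefcount(data))   # typically 2; the list dies when data goes away
```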

------
lucidguppy
I think that learning C before C++ might hurt you.

[https://www.youtube.com/watch?v=YnWhqhNdYyk](https://www.youtube.com/watch?v=YnWhqhNdYyk)

~~~
SlySherZ
Note: I didn't watch the video yet and my opinion might change by then.

C teaches you something very useful: how the computer works under the hood.
Most things you do in C map loosely to what the computer actually does, and
learning C helps you understand the machine.

I'm not saying that it's a good language to start with, but it's okay. Unlike
C++.

IMO, C++ is the worst possible language to start with. The reason is that C++
tries to support everything, often using handy but cryptic syntax. C++
doesn't guide you to program one way or another; it simply adapts to every
possible way of solving the problem. This might be good in the hands of
someone who already knows how to solve the problem, but it's completely
overwhelming for a beginner. It feels like you're taking the wrong turn at
every step.

A much better alternative would be JavaScript. Yes, it does have flaws, but
it's much simpler and has personality.

~~~
lucidguppy
The argument is that learning C before C++ teaches you how to program C++
wrong. Nothing more, nothing less.

I wasn't judging C++ as a beginner language. I wasn't judging C either.

~~~
userbinator
IMHO "wrong" being mostly the opinion of the evangelists who think C++ is a
completely different language, and have been trying to push it in that
direction, but it's still mostly C at the core with a _thick_ layer of
abstraction/indirection on top. Don't believe all the hype. "Climbing the
ladder of abstraction" is the best way to learn how everything works.

------
Koshkin
An excellent article, well worth reading. As it points out, learning a
programming language for the sake of learning requires being highly selective
about which languages to learn. One language I would advise against learning
is C++: first, it may take too much time to learn to be worth the effort,
and second, the language itself is not all that interesting. Its standard
library (especially the STL), while it does bring some novel ideas to the
table, is so heavily shaped by the particulars of the language itself that
the D language might serve the same purpose better. Also, please do yourself
a service and learn C# instead of Java. (This is not to cast any doubt on
the _extreme_ usefulness of learning both C++ and/or Java for practical
purposes.)

~~~
voltagex_
> Also, please do yourself a service and learn C# instead of Java.

Nah. Learn whatever takes your fancy. Or whatever you think will get you a
job. Or learn whatever you can get tutored in because it's almost always
easier to learn when you have someone you can ask questions to in person.

Learn structure and logic, but don't tell anyone they're less of a programmer
because they use X instead of Y.

