
C is Manly, Python is for “n00bs” - kevbin
https://modelviewculture.com/pieces/c-is-manly-python-is-for-n00bs-how-false-stereotypes-turn-into-technical-truths
======
wrp
The article takes the stance that all stereotypes are groundless, but a more
realistic position is that they are often generalizations with some valid
foundation. I'll just illustrate with my experiences of Python software. I
have used many, many data manipulation, simulation, visualization, etc. tools
over the years, most of them coming out of either academia or government labs.
I long ago developed the habit, when faced with a tool written in Python, of
looking for an alternative written in C/C++, because in my experience the
C/C++ alternative will usually be more stable and better thought out. I have
not worked deeply with Python or Python programmers, so I
can't offer an informed opinion of whether the main problem lies in the people
or the tools. It has been my tentative hypothesis that this trend is the
result of C/C++ placing a higher bar to use, requiring developers to both be
more capable and spend more time planning.

------
brudgers
First I was reminded of Uncle Bob's talk on programming language hubris:

[https://www.youtube.com/watch?v=YX3iRjKj7C0](https://www.youtube.com/watch?v=YX3iRjKj7C0)

but as I thought more deeply about the effects of programming language
tribalism, I was reminded of

[http://web.mit.edu/humor/Computers/real.programmers](http://web.mit.edu/humor/Computers/real.programmers)

its reference to

[https://en.wikipedia.org/wiki/Real_Men_Don't_Eat_Quiche](https://en.wikipedia.org/wiki/Real_Men_Don't_Eat_Quiche)

and how this played out as a sound bite in US popular culture at the same time
as women's enrollment in computer science began to decline. Being in the high
school graduating cohort where the decline began, the decline of women in
computing is a subject I'm interested in. It's one of those subtle
cultural/generational differences between me and my boy.

------
sgt101
The title should have (2015) in it.

This is a weird one. I feel like I am stumbling into a big elephant trap: the
authors seem to know what they are talking about, but they have missed some
fundamental points.

Firstly, there is a statement asserting that it only takes a few months to
learn a language if you are a competent developer, which I agree is true to a
point, but the assertion continues with the view that therefore we shouldn't
hire based on what languages a person knows.

Now, in a big company where there is a lot of runway for a project and a good
training budget - and where you are building capability, spotting talent and
hoping to build your own cheap, well-integrated team rather than a bunch of
half-crazed guns for hire - then yup, I agree, but a bit of nuance is needed.
A lot of places cannot function with people learning things for two or three
months (or weeks, or even days). There's work to be done and it needs doing.

Then there are statements about Perl, Python and Javascript being fairly
similar - which they are, in a narrow, technical, "look how the compiler
works" way. But - and I don't know about you (obviously) - although I loathe
Python (irrationally), I would feel relaxed and happy picking up 30k loc to
debug and maintain if it were written in Python, I would be somewhat nervous
if it were Javascript, and I would be having a breakdown if it were Perl.

But as the authors note, horses for courses - Python is not a great fit (ok,
it's pretty good) for processing lots of strings and so on; Perl is good for
that. Javascript - guess what - I think Javascript is a boon for creating web
front ends. Perl, not a great choice.

In a similar vein, I use R sometimes for interactive data mining. It is very
handy in terms of setting up analysis pipelines and creating insight. I like it
for that. But I have folks working for me who write substantial projects in R
and I constantly worry about maintenance and sustainability in the field. It
isn't great for creating a structured and well understood code base.

With respect to the gender politics and the "exclusion" angle, this was a very
real problem 25 and more years ago - one that I (somewhat shamefacedly) must
confess that I innocently profited from. I got the chance to learn Smalltalk,
Objective-C, C, PL/SQL and C++ using expensive compilers on expensive
computers. This gave me a substantial leg up; people in India or Africa had
(effectively) no chance of doing this. Now almost everyone can get the
opportunity (I'm not saying with ease, or that the opportunity is just there,
but it is possible) to learn Java or Python or C++, MySQL, Javascript and many
more valuable, useful tools. The question in my mind is why some people end up
learning ones that then aren't too useful to them. I learned PHP some years
ago for a particular project; it was useful, and I will never use it again.
But why is it that someone learns PHP and then doesn't learn Python? Or Ruby?
Or Julia?

If you want a job in a company that uses Python, why wouldn't you learn it?
Why wouldn't you write a few projects and put them on Github?

