It seems to me that there are two types of "Computer Scientist":
One with a more traditional education, including all the tech subjects plus all the college-level math (three or four calculus courses, three or four algebra courses, discrete mathematics, differential equations, etc.). I am in this group.
And then the more... I don't know what to call it... "modern education", maybe? Which is very light on the math side and very focused on implementing the technical side.
Computer science is not about computers, nor is it a science. It is about the process of information transformation. While the book shows some of that in Python, a single language is not enough to "Think like a Computer Scientist"; worse, it helps perpetuate a false impression of what a computer scientist does.
I would fall into your latter camp, being self-taught. I started with the interpreted languages JS, Ruby, and Python, and in the end I didn't feel like I understood what I was doing until I got into C and even Objective-C. It wasn't until I touched a pointer or dealt with memory management that I felt I was really more than a coder.
Not everyone wants to learn C, or C++ for that matter, but at least taking the time to understand I/O and (in my case) POSIX systems goes a long way toward writing better programs, even if they are just in Python.
I began as self-taught like you, but I went from BASIC to assembler in a year. I gained a lot by understanding what the compiler was trying to make work, so I agree with you that understanding the lower levels goes a long way toward improving what you produce at a higher level.
Then I went to college, and math forced more abstract thinking on me. I had to make it work in my head, because writing the symbols would not make sense if I could not conceptualize them first. I honestly hated differential equations, but in retrospect they helped me a lot in writing functions that return new functions, and this is in C# (I don't like the mess that C++ became, but I respect the people who do).
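To make that concrete (in Python rather than C#, since Python is the thread's topic), here is a minimal sketch of a function that returns a new function; the names are made up for illustration:

    def make_multiplier(factor):
        # Returns a new function that closes over `factor`.
        def multiply(x):
            return factor * x
        return multiply

    double = make_multiplier(2)
    print(double(21))  # 42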
I am a computer scientist, and even though I've used Python extensively in both product development and prototyping, I honestly don't think Python helps people here.
If you want people to learn to think like a computer scientist via programming languages, teach them at least one language in each of the three major paradigms (OO, FP, declarative); see the sketch below.
(There's more to learn: data structures, architecture, math, and so on, but I won't delve into that here.)
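As a rough illustration of how the paradigm shapes the code, here is the same small task written imperatively and functionally, both in Python (declarative languages like SQL or Prolog don't map cleanly onto Python, so this sketch only covers two of the three):

    nums = [1, 2, 3, 4, 5, 6]

    # Imperative/OO style: mutate an accumulator step by step.
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n

    # Functional style: compose expressions, no mutation.
    total_fp = sum(n * n for n in nums if n % 2 == 0)

    assert total == total_fp == 56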
The problem is that the insight comes at a price: you'll start comparing and might conclude that LanguageX is not suited for TaskY.
Python isn't suitable for everything, but very often it's good for at least a prototype.
Well, yes, but you're missing the parent post's point. It's not saying that Python is a bad language; that's sort of irrelevant. The post uses the language decision itself (judging which language fits a task) as an example of "thinking like a computer scientist." No matter how good or bad any single language is, learning to use a language is not "thinking like a computer scientist."
OK, I agree. In the computer science sense, I enjoy Python because I find it has a high degree of transparency. That is just a personal judgement though.
It's not bad - I read this book a while ago to boost my Python-fu. But it left me feeling like most software/language books do: unsatisfied. Here are the pieces (syntax), here are a few (boring, trivial) ways to put the pieces together, and so on. For a really strong CS foundation, SICP still rules. :)
I do most of my work in Python, but the one thing that I use all the time and that makes my life easier is types. I was introduced to types via C++ and Java, and having to declare them explicitly gives (imo) a much more concrete idea of what a program is actually doing. You can learn a little bit about types in Python, but their connection to correctness and strange bugs is hidden by the dynamic type system.
So while I like Python a lot and use it as often as I possibly can, I think it hides one of the key elements that distinguishes thinking like a computer scientist from just thinking logically. Computers are all numbers; types give numbers meaning and provide strong tools for intuition. Python hides that.
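A minimal sketch of what I mean; the function and values are made up, but the behavior is real Python. The string version doesn't fail, it silently produces garbage, where a statically typed language would reject the call outright:

    def total_price(quantity, unit_price):
        return quantity * unit_price

    print(total_price(3, 9.99))    # 29.97, as intended
    print(total_price(3, "9.99"))  # "9.999.999.99": for str, * means repeat

(Modern Python's optional annotations plus an external checker like mypy recover some of this, but nothing in the language itself enforces them.)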
I find that I rarely get confused about the types of the things that I'm manipulating; in practice the problems I have are unexpected behavior from, say, third-party web APIs, surprising data at system boundaries, and so on. Usually I try to mitigate this by introducing a "type system" (or, more accurately, declarative data validation enforced at runtime) near the boundary of the application, such as a parser, JSON schema, or option parser.
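Something like this hypothetical parse_user, which is just an illustration of the pattern, not any particular library:

    def parse_user(raw):
        # Validate untrusted input at the boundary; everything past
        # this point can assume the shape is correct.
        if not isinstance(raw, dict):
            raise ValueError("expected an object")
        name = raw.get("name")
        age = raw.get("age")
        if not isinstance(name, str):
            raise ValueError("name must be a string")
        if not isinstance(age, int) or age < 0:
            raise ValueError("age must be a non-negative integer")
        return {"name": name, "age": age}

    user = parse_user({"name": "Ada", "age": 36})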
The biggest thing I do miss in Python is solid and natural concurrency. Gevent is the closest I've found, but it doesn't play as nicely as I would like.
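For what it's worth, the spawn/join style gevent encourages looks like this; the URLs are placeholders, and monkey.patch_all() is what makes the stdlib socket layer cooperative so the fetches overlap:

    from gevent import monkey
    monkey.patch_all()

    import gevent
    from urllib.request import urlopen

    def fetch(url):
        return urlopen(url).read()

    jobs = [gevent.spawn(fetch, u) for u in ("http://example.com",
                                             "http://example.org")]
    gevent.joinall(jobs, timeout=10)
    print([len(j.value) for j in jobs if j.value is not None])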
"Applied Python" would be a better title.