As with most language comparisons, the author is clearly established on one side of the fence and unwilling to investigate in any meaningful sense.
He begins with a somewhat spurious statement on how code must be arranged, continues with a complete misunderstanding of what most people mean by scope, and then accuses Python of innumerable memory leaks and associated bugs before he even gets through his first set of comparisons. He claims Python dictionaries abuse the term 'immutable', but the docs make no such claim. You might learn something about Clojure, but take everything he says about Python with a grain of salt.
(Anyone who is interested in concurrent Python should check out multiprocessing, not threading.)
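A minimal sketch of that approach, assuming a toy CPU-bound worker (the function names are mine, not from the article):

```python
from multiprocessing import Pool

def square(n):
    # CPU-bound work runs in a separate process, sidestepping the GIL.
    return n * n

def parallel_squares(values, workers=4):
    # Fan the work out over a pool of worker processes.
    with Pool(workers) as pool:
        return pool.map(square, values)

if __name__ == '__main__':
    print(parallel_squares(range(10)))
```

Unlike threads, each worker here is a full OS process, so CPU-bound work actually runs in parallel.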
What bugs me about this article is that his Project Euler solutions are taken from a website which tries to make Python into a functional language, and they are needlessly verbose. For instance, the normal way to solve PE#1 (sum the multiples of 3 or 5 below 1000) is:

print sum(x for x in range(1000) if x % 3 == 0 or x % 5 == 0)
Quite a bit shorter than the clojure solution.
Further, their "isPalindrome" function involves a "digits_from_num" call, and is almost 10 lines in total. You could shorten that to one if you check pal == pal[::-1].
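The one-line version of that check, as suggested (the function name is my own):

```python
def is_palindrome(n):
    # A number is a palindrome iff its decimal string
    # reads the same forwards and backwards.
    s = str(n)
    return s == s[::-1]

print(is_palindrome(9009))
```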
Essentially, the guy copied code from a website which wasn't really trying to show off how succinct python could be.
The author seems to be strongly biased against python.
I'm no great fan of some of Python's choices, but if you find x things wrong with Python and 0 wrong with Clojure, that is fairly indicative that this is not a balanced review.
Not only that, he's wrong - Python is sensitive to indentation, not whitespace. As long as your block is at the right depth, you can have as much whitespace as you want. I regularly format code so similar things line up. This is perfectly valid Python:
def foo():
    longvariable = 1
    x            = 2
Also on the subject of lambdas, this is also valid:
def foo():
    def bar(x):
        return x + 5
    print bar(5)
With a nice clean syntax for nested functions, lack of multi line lambdas is much less of an issue.
To give an example in reply to the Clojure snippet that is now mysteriously missing from the article (to demonstrate something Python users should be jealous of)...
This is perfectly valid in Python:
users = [   # Name    ID  Age  Phone
    [ 'John',  5,  37, '555-1234' ],
    [ 'Bill',  5,  29, '555-4321' ],
    [ 'Jean',  5,  32, '555-4132' ],
]
Edit: Always a good idea to test even the simplest snippets before posting them...
I think it is not a question of 'is English better than Spanish?'. The real question is: is English or Spanish the better language to communicate in for a specific purpose, just as the real question here is whether Python or Clojure is the better language to achieve a certain goal.
So, without the context of that goal a comparison is meaningless.
But, let's say we create a certain application (say an accounting program, or a spreadsheet) in both Clojure and Python, feature-for-feature identical, user interface identical and so on; then a comparison becomes possible.
How long did it take to complete the project?
How easy is it to change the program to conform to some new specification?
How fast does it execute?
And so on. It all boils down to the old benchmark problem, a benchmark in a vacuum is meaningless, you have to test using real applications.
Then we agree, I think. My point is that it is quite possible, even easy, to describe experiences with a language ("It was easier to communicate in Spanish in Madrid").
But it is rather pointless to debate languages in isolation ("Spanish verb conjugations must lead to more powerful novels", or somesuch).
The same applies to programming languages, I think.
To me it's a mixed bag, a kind of 'price of admission': if you've been conditioned your whole life to look for matching braces (or at least begin/end pairs), and even languages that are strictly speaking markup languages seem to feel the need to delimit the end of a block in some way, then you alienate a large number of people before they've even written a single line.
Compound that with the difficulty of passing code snippets using various media and it is easy to see why so many people argue against this.
You can get used to it, sure. But that's not a very strong argument. I think there would have been ways to get out of this.
Especially when programming with tabs you'd better be very careful not to hit delete by accident while moving the cursor around near the end of a block of code. That can really mess you up.
edit: also, for that reason alone, any discussion about the relative merits of Python vs. something else first has to get past the part where we talk about the whitespace. And some people will go to extreme lengths to prove its 'goodness' even when it clearly is not purely good.
Being objective about something you like is very very hard.
I find C dead easy to read, that's because I've read lots and lots of it. Any other language is harder for me.
edit: on another note, in a delimited language the indentation can be generated fully automatically; in Python it cannot, placing most of the burden of making sure the indentation is correct on the human.
This lack of 'redundancy' has bitten me several times already.
Right, it is subjective, but requiring indentation imposes a higher baseline level of readability: what you see is what you get.
This is important when you're working with more than just your own code. It's like a forced style guideline, and it gets even better when everyone conforms to Python's other optional style guidelines. It reduces friction.
But it increases friction between those that are 'in' and those that are not (yet). And even some of those that are 'in' have their reservations.
I completely understand the reasoning behind it, but the number of times that I've run in to practical problems because of having some kind of delimiter is 0, the number of times that I've run into some kind of trouble with python because of not having delimiters is starting to add up.
This may be a factor of inexperience, time will tell. But it is definitely a new 'source' of bugs, and sources of bugs are in my opinion things that you avoid like the plague when designing a programming language.
Even in C (or C++, or anything that derived its syntax from those languages) I always found it both wrong and inconsistent to be able to say:

if (a)
    b++;

To me that's asking for misery: you could remove the whole if (a) line and the thing would remain syntactically correct.
I think of the { } just like I think of guardrails on the highway. They don't make stuff prettier, sometimes you can't look across, but they certainly keep you safer.
True, it is absolutely beyond that. But I think that it would have helped significantly in an earlier stage, I even think that PHP would have been run into the ground.
In the long run that may still happen, there definitely is shift happening.
That's probably an issue, I'm very - very - old fashioned when it comes to editors, I switch systems all the time so I've come to rely on a minimal subset (vi, without any fancy plug ins) to do my work.
This is probably a limiting factor. For years I thought of IDEs as training wheels, I'm beginning to see that in some cases IDEs are more than that.
But I'll never be comfortable with them.
My idea of building an application is to log on to the server, create an svn directory and start hacking away at it. That's probably not the most productive way to get things done these days.
The only language I work in where I've made the concession to use an IDE is Java, simply because there are too many syntactic gotchas that I can't seem to predict ahead of time, and the edit-compile-test cycle is ridiculously long on the project that I'm working on. So that's where (at least to me) the IDE seems worth the fact that I can only work on that stuff when sitting in front of a serious machine.
I'd hate to run intellij on a laptop screen, forget about a netbook.
I think there is a distinction between, say, ed and Eclipse (to put it in the extremes). I also dislike IDEs like Eclipse (for various reasons: partly a lot of annoyance getting it to run properly on this box, partly slowness, partly being a vim user for a long time), but on the other hand, vim with some plugins and the regular syntax highlighting and indenting for various languages is very helpful, without really getting in your way.
I posted this there, but I feel more people prefer to read comments here.
---
First, I love both Python and Clojure, and completely agree that Clojure is more functional and aimed towards concurrency (this was its goal, after all). But I feel like you're doing Python wrong here.
> I think this should make a Pythonist a little jealous
No, I think you misunderstood the whitespace rules for Python. First, indentation isn't fixed at "4 spaces"; it just has to be consistent within a block. You could use 8 tabs if you wanted, or 2 spaces, or 7 spaces. Also, inside data structures, strings, function calls, etc. you can go whitespace crazy. This is valid Python:
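(The snippet appears to have been lost from the comment; something like this, with identifiers of my own, illustrates the point:)

```python
# Statement indentation must be consistent within a block, but
# inside brackets you can align things however you like.
inventory = {
    'apples':  10,
    'pears':    4,
    'bananas': 22,
}
total = sum( inventory.values() )
print(total)
```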
> The last print statement outputs "10" which is surprising to me, since y has never been declared in that given scope.
But it was declared in that scope: an 'if' block does not create a new scope in Python; only modules, classes, and functions do. It may not be what you're used to, but you can't blame Python for your assumption about what should have its own scope.
> To resolve it you have to manually call 'del y'
What do you mean to resolve it? Why are you deleting the reference at all? You very rarely see 'del' in real Python code.
> but I can't count the number of C app's that have suffered from memory-leaks and what not, based on the lack of automated garbage collection
What does that have to do with Python? CPython is reference counted (with a cycle detector), so those values are reclaimed once nothing references them, just as the JVM's garbage collector reclaims them for Clojure.
> But this is very different from Clojures concept where immutable means: 'x' never changes.
That isn't what it means in Clojure at all.
user=> (def x 1)
#'user/x
user=> x
1
user=> (def x 5)
#'user/x
user=> x
5
I just changed 'x'. What immutability means is that the value 'x' points to does not change. And that's exactly what it means in Python, too, but dictionaries are mutable (and this is documented). For example, strings in Python are immutable, just as in Clojure. If two things reference the same string and one "changes" it only means that it references a wholly new string.
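The same distinction, sketched in Python:

```python
x = "hello"
y = x           # y and x reference the same immutable string
x = x + "!"     # rebinds x to a brand-new string object
print(y)        # the string y references never changed
```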
> Guido is not preparing to transition into modern programming
That's a little extreme. Concurrent programming is valuable, but there will also be many programs that fit single process / thread models very well until the end of time.
The whole sentence regarding an if block's (lack of) scope seems... wrong.
> .. since y has never been declared in that given scope. I don't quite know how Pythonists handle this, but I can imagine many situations where this will be the cause of much confusion
Given the following code:
if some_irrelevant_function():
    x = 2
else:
    x = 5
print x
The author is claiming it would be less surprising for that to raise a NameError exception, complaining x is undefined?
> To resolve it you have to manually call 'del y'
No. You just have to be sane with the usage of your variables. I have never run into a situation where this has been a problem.
> I don't have the numbers for Python, but I can't count the number of C app's that have suffered from memory-leaks and what not, based on the lack of automated garbage collection. With Pythons use of scope, I imagine quite a few bugs will follow.
Err, no.
----
> Now in one Python doc I found they actually refer to this type of data [dictionary] as immutable
Assuming the docs actually say this, they are incorrect and should be amended. Dictionaries are absolutely mutable.
> Lambda
Err. The complaint here is you cannot use "?" in an identifier...?
> To Pythons credit it actually pinpointed the exact character which caused the exception - You shouldn't except that much help from Clojures backtraces....yet
Not to make this into a "language X vs. language Y" debate, but I'd say having decent error messages is far preferable to being able to use symbols in an identifier's name.
I agree for the most part, but incidentally you would never (def x 1) and then (def x 5) in a running program -- you would use an atom, ref, or var instead. Redefinition of a symbol is permitted so you don't have to quit and restart your REPL every time you rewrite a function.
A more meaningful objection is that you can shadow a local with a different local in Clojure, like this:
(let [a (some-calculation)
      b (something-else)
      a (some-function-of a b)]
  a)
which is essentially what is being done in the Python code.
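Spelled out in Python, with stand-in definitions for the hypothetical helpers in the Clojure snippet:

```python
def some_calculation():       # placeholder for the Clojure helper
    return 3

def something_else():         # placeholder
    return 4

def some_function_of(x, y):   # placeholder
    return x + y

a = some_calculation()
b = something_else()
a = some_function_of(a, b)    # `a` now shadows its earlier binding
print(a)
```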
I don't think there are many strong arguments to make in favor of Clojure over Python other than concurrency and macros, and maybe the immutable hashes.
> you would never (def x 1) and then (def x 5) in a running program
Right, I understand. It was the way the author stated 'x never changes' that made me believe he doesn't understand the difference between binding a name and changing a value.
I like Clojure, but this whole article strikes me as thinly veiled language trolling. It's not a comparison, it's a long blog posting talking about how Lau thinks Clojure is better than Python.
I am willing to give him some benefit of the doubt because I've had interactions with the author in #clojure on irc in the past and I don't think he's got bad intentions. He just has strong opinions about things.
I'm cool with that, but if you have strong opinions pro one language and contra another, and lots of experience in one and almost none in the other (even I can tell that, and I'm pretty much a newbie to both Python and Clojure), then you shouldn't be making comparisons.
It is a disservice to clojure as well, because it makes people think that clojure proponents are thinking about clojure in terms of religious dogma, instead of rationality.
He is basically dinging Python for setting up some pretty basic functions like ireduce(). Wouldn't you do the same in Clojure if something didn't exist? If it's just about having short code, I would suggest the line 'import numpy'.
Also for the concurrency parts, if you are talking strictly about the languages and not their implementation, why not compare Jython to Clojure?
If you read this article you can clearly see the author's lack of knowledge of the Python language itself.
If you're going to write an "X vs. Y" piece, you'd better know both sides pretty damn well; otherwise it ends up being a fairly worthless comparison, IMHO.
There is a bug in the fibonacci example: it sums odd numbers instead of even.
from itertools import takewhile

def ifibonacci(a=1, b=1):
    while True:
        yield a
        a, b = b, a + b

candidates = takewhile(lambda n: n < 4000000, ifibonacci())
print sum(n for n in candidates if n % 2 == 0)