This list is so deliciously sophomoric. The best one is, and I quote:
"""To many other weirdo bits of magic syntax, like [list comprehensions]"""
Of course, without actually proposing how comprehensions could be made better, one has to hope the author would say he prefers the Haskell equivalent, but I strongly suspect that is not the case.
Complaining about having to cast generators to list... seems like the kind of dev that has their code randomly run out of memory until a senior dev comes and fixes it.
It is annoying! And it doesn't always save memory. In theory a "sufficiently smart language" should be able to understand that `thing[3]` can be translated to "call `next()` four times and give me the last one", rather than "turn this whole thing into a list, which may take a ton of memory, even though I only need that one element".
Generators should ultimately be treatable as lazy lists, and lists should share a common interface whether they are lazy or eager. Someone should figure out a way to let us write code at a higher abstraction level, with the same interface for lazy vs eager data structures.
...but it's not gonna happen in a dynamic language like Python. And I can't say I like the "solution" of having the entire language be lazy like Haskell either :|
We're stuck with "casting generators" for now, I guess, but it really does suck!
You don't need to cast the generator, you can do:
next(itertools.islice(my_generator, n, n+1))
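For a self-contained illustration, here's that same trick with a throwaway generator (the `squares` generator and the index are just placeholders for the example):

```python
import itertools

def squares():
    # An infinite generator: list(squares()) would never finish,
    # so "casting to a list" isn't even an option here.
    n = 0
    while True:
        yield n * n
        n += 1

n = 3
# islice skips the first n items lazily; next() pulls out item n.
value = next(itertools.islice(squares(), n, n + 1))
print(value)  # 9
```

Note this still consumes the generator up to position n; it only avoids materializing everything into a list.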
With that said...
> In theory a "sufficiently smart language" should be able to understand that `thing[3]` can be translated to "call `next()` four times
This might be a newbie trap, because next() isn't the same as indexing. What happens if I perform `thing[7]` followed by `thing[5]`? Should performing `thing[7]` keep items 0-6 in memory and turn the object into a generator-list hybrid?
> Should performing `thing[7]` keep items 0-6 in memory and turn the object into a generator-list hybrid
You're right there. It probably can't work like that, since generators are too general: you can't expect them to be rewindable or to be free of side effects. You'd probably need a more specialized concept, like a "lazy list" that is a subtype of generator with some extra restrictions, making it possible to implement the "hybrid" structure as an implementation detail without changing semantics.
Anyway... it would be too much work and probably would turn into a footgun.
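For concreteness, a minimal sketch of what such a hybrid might look like, as a hypothetical `LazyList` wrapper (not a real library class) that caches whatever it has already pulled so earlier indices stay accessible:

```python
class LazyList:
    """Hypothetical indexable view over an iterator.

    thing[7] followed by thing[5] works, at the cost of keeping
    items 0..7 cached in memory. Slices and negative indices are
    deliberately left out to keep the sketch short.
    """

    def __init__(self, iterable):
        self._it = iter(iterable)
        self._cache = []

    def __getitem__(self, index):
        # Pull from the underlying iterator until the cache covers `index`.
        while len(self._cache) <= index:
            try:
                self._cache.append(next(self._it))
            except StopIteration:
                raise IndexError(index) from None
        return self._cache[index]


thing = LazyList(x * x for x in range(100))
print(thing[7])  # 49 -- items 0..7 are now cached
print(thing[5])  # 25 -- served from the cache, no re-iteration
```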
I personally find list comprehensions in Python pretty horrible.
They seem to exist only to cram lots of stuff into a single line of code. You end up with totally impenetrable unreadable Perl-esque garbage write-once-read-never code that is too clever for its own good. And people say Python is easy to learn and good for beginners...!
A better approach would be something like the Java Streams / .NET LINQ / Rx pattern, IMO. Explicit, clear, no magic, logical.
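To make that concrete in Python terms, a chainable wrapper along those lines might look something like this (`Stream` is hypothetical, not anything in the standard library):

```python
class Stream:
    """Hypothetical fluent wrapper over an iterable, loosely in the
    Java Streams / LINQ style. Everything stays lazy until a
    terminal operation such as to_list() is called."""

    def __init__(self, iterable):
        self._iterable = iterable

    def filter(self, predicate):
        return Stream(x for x in self._iterable if predicate(x))

    def map(self, fn):
        return Stream(fn(x) for x in self._iterable)

    def to_list(self):
        return list(self._iterable)


evens_squared = (
    Stream(range(10))
    .filter(lambda x: x % 2 == 0)
    .map(lambda x: x * x)
    .to_list()
)
print(evens_squared)  # [0, 4, 16, 36, 64]
```

Whether that reads better than a comprehension is, of course, exactly the point of disagreement.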
You can decompose any list comprehension into its equivalent for loop fairly easily. I don't see what is so magic about them, other than the language having a special compiler-level optimization for them?
g = [i for i in items if i == 3]

is equivalent to:

g = []
for i in items:
    if i == 3:
        g.append(i)
I see people frequently say that list comprehensions are special-cased and highly optimized by the compiler, but I did a few experiments recently with `timeit` and found that using `map()` was on the order of 1.5x faster in all the cases I tested. I think either of those winds up being faster than a manual for loop, though.
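For reference, this is roughly the kind of `timeit` comparison being described; the workload is arbitrary, and the actual ratios will vary with the Python version and with whether the mapped function is a builtin or a lambda:

```python
import timeit

data = list(range(1000))

def double(x):
    return x * 2

def with_comprehension():
    return [double(x) for x in data]

def with_map():
    return list(map(double, data))

def with_loop():
    out = []
    for x in data:
        out.append(double(x))
    return out

for fn in (with_comprehension, with_map, with_loop):
    # number is arbitrary; compare the relative timings, not the absolutes
    print(fn.__name__, timeit.timeit(fn, number=2000))
```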
dic = {k: v for k, v in dic.items() if k in other_dic and v == "bar"}
How could that be improved? That's 3-4 LOC minimum in any other language
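For comparison, here is the explicit-loop version of that same filter, with some placeholder data so it runs on its own:

```python
dic = {"a": "bar", "b": "baz", "c": "bar"}
other_dic = {"a": 1, "b": 2}

# The comprehension from above
result = {k: v for k, v in dic.items() if k in other_dic and v == "bar"}

# The equivalent explicit loop
filtered = {}
for k, v in dic.items():
    if k in other_dic and v == "bar":
        filtered[k] = v

print(result)    # {'a': 'bar'}
print(filtered)  # {'a': 'bar'}
```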
My main gripe is Python's ternary operator: since the True value is evaluated before the condition, if you are doing ternaries on things that might throw exceptions, the False value has to come first:
value = 0 if key not in dic else dic[key] * 5
rather than `value = dic[key] * 5 if key in dic else 0` (throws a KeyError if key isn't in dic)
There are a number of precedence rules you need to keep track of to parse the list comprehension, and there are two different syntaxes that do the same thing. Arguably you need to do the same if you don't already know how the threading macro in my second example works.
I posted due to your claim that all other languages necessitate increased verbosity. I should have let it lie, as I didn't intend to start a language war, just to post a counterexample. My apologies.
The True value appears textually first, but the evaluation order is the same as the classic ?: ternary: the condition is evaluated first, and only the selected branch is evaluated. ```[][0] if False else 'foo'``` will not raise an IndexError.
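An easy way to see the evaluation order for yourself (`trace` is just a throwaway helper for this demonstration):

```python
def trace(label, value):
    # Print when each sub-expression is actually evaluated.
    print("evaluating", label)
    return value

# The condition runs first; only the selected branch is ever evaluated.
result = trace("true branch", "yes") if trace("condition", False) else trace("false branch", "no")
# Output:
#   evaluating condition
#   evaluating false branch
print(result)  # no
```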
" impenetrable unreadable perl-esq garbage write-once-read-never code " - I had about the same impression when I first saw it. I think ternary operator is as far as I am willing to go ;)
List comprehensions can be elegant and actually improve readability. However, I do agree that they can easily be misused. I have seen many junior Python programmers write super long, complicated comprehensions that hurt my brain. They think it is pretty cool just because their solutions are one-liners.
List comprehensions are one of those features that invite abuse (like most things): light, simple use is fantastic, but some people just take them too far into unreadable nastiness.
For example, instead of putting someone's article down as sophomoric, you could explain what's different and possibly better about Haskell list comprehensions.
There's a huge difference between a classroom environment or academic symposium, and an internet forum. Here, when people attack and take swipes at each other, discussion slides downhill so fast that it becomes an existential issue for the forum. Having HN not destroy itself the way internet communities usually do has been the main goal here since pg created the site over a decade ago, and the guidelines here are written with all that experience in mind.
I used to think about this much the way you express it, because I've always enjoyed reading about the sort of discourse in which devastating wit is exchanged. But eventually I realized that it doesn't translate into this context at all. This is a case of 'the medium is the message'. When you have millions of people who don't know each other all potentially interacting at the same time, the dynamics are so different as to be incommensurable with, say, small debates, literary journals, elite social events, and other places where the groups are small and highly cohesive. Here are some previous explanations about this: