Why are there so many negative comments? Maybe those posters are vastly underestimating how many people that just start out read HN. I think it's a pretty good post to read after something like "X in Y minutes - Python" to get a very quick grasp of what the language is like.
I'm also not ashamed to say that, despite having written quite a few LOC of Python, I wasn't aware of named slices for some reason. I think they could clear up some chunks of code I've produced (make them more readable).
Extended unpacking was a new one on me. Kind of like a simple version of pattern matching. I think this one will prove useful!
(I've been using python for many years, but not full time. It's the language I use for one-off scripts, small tools that make use of its handy standard library, or prototypes for things I'm going to write in some other language. That's my excuse for the gaps in my knowledge.)
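For anyone who hasn't seen it, a quick sketch of what extended unpacking (Python 3) looks like; the starred target soaks up the "rest":

```python
# Extended unpacking: a starred name collects whatever the other
# targets don't consume.
first, *middle, last = [1, 2, 3, 4, 5]
print(first)   # 1
print(middle)  # [2, 3, 4]
print(last)    # 5

# Handy for splitting a line into a command and its arguments:
command, *args = "cp -r src dst".split()
print(command, args)  # cp ['-r', 'src', 'dst']
```

The starred name always binds to a list, even when it catches zero or one element.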
> ... I wasn't aware of named slices for some reason ....
I wasn't either. But I wouldn't call them a hidden feature of the language. A slice object is an object. So you can make it the value of a named variable, just like any other object. Somehow that wasn't obvious, though.
Now I'm wondering how else I might be able to improve my code by giving names to non-obvious things.
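Here's a minimal sketch of the named-slice idea; the record layout and field positions are invented for illustration:

```python
# A slice object is just a value, so it can be given a descriptive name.
record = "2013-01-15 ERROR disk full"

DATE = slice(0, 10)
LEVEL = slice(11, 16)
MESSAGE = slice(17, None)

print(record[DATE])     # 2013-01-15
print(record[LEVEL])    # ERROR
print(record[MESSAGE])  # disk full
```

Compared to `record[17:]` scattered around a codebase, `record[MESSAGE]` documents itself.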
Great suggestion. Just searched for him on youtube and a couple of excellent looking talks popped up. Halfway through and this one seems great so far:
https://www.youtube.com/watch?v=OSGv2VnC0go
Python's unpacking is a poor man's pattern matching. I'd really love to see them extend it to support user-defined patterns like Scala's extractors or F#'s active patterns.
Unpacking pretty much does one thing, and it makes that one thing easier and a lot more readable. Combine that with list comprehension (from what I understand C#'s LINQ is similar) and you end up with code that's highly maintainable/readable (as long as you name your variables appropriately, of course).
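A small sketch of the combination (the data here is made up):

```python
# Unpacking inside a comprehension keeps pair-processing readable:
pairs = [("alice", 85), ("bob", 92), ("carol", 78)]

names = [name for name, score in pairs]
passing = [name for name, score in pairs if score >= 80]

print(names)    # ['alice', 'bob', 'carol']
print(passing)  # ['alice', 'bob']
```

The `for name, score in pairs` part is ordinary tuple unpacking, applied once per element.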
I believe Python supports pattern matching other than Regex as well.
Python list comprehensions are not lazy, though. If you write [x for x in xrange(1, 10000)] you'll materialize every element in your list immediately. My understanding is that LINQ's comprehensions are lazy.
> I believe Python supports pattern matching other than Regex as well.
That's pattern matching on strings. The kind of pattern matching being discussed here is pattern matching on data types (see [1]).
I'm aware of generator expressions, but the fact that list comprehensions are not lazy can trip people if they are used to other languages with this feature, since they are usually lazy.
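To make the eager/lazy distinction concrete (using Python 3's `range`; in 2.x this would be `xrange`):

```python
import itertools

# List comprehension: eager -- the whole list is built immediately.
squares_list = [x * x for x in range(10000)]
print(len(squares_list))  # 10000

# Generator expression: lazy -- values are produced on demand.
squares_gen = (x * x for x in range(10000))
print(next(squares_gen))  # 0
print(next(squares_gen))  # 1

# Laziness lets you take a few items without computing the rest:
first_three = list(itertools.islice((x * x for x in range(10000)), 3))
print(first_three)  # [0, 1, 4]
```

Syntactically the only difference is the brackets, which is probably why the eager/lazy split trips people up.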
Reading through http://en.wikipedia.org/wiki/List_comprehension, it seems that there are roughly equal numbers of languages where "list comprehension" produces strict list as languages where it produces lazy lists.
I think it's a lack of naming convention - the word "list" on its own means "strict list" in some languages and "lazy list" in others.
I don't agree; people are more likely to come from Python than to Python from a language that has lazy list comprehensions. But regardless, using a language requires that you learn it, and I doubt many people think list comps are lazy. Pythonic programming obeys explicit over implicit, which means if you want a lazy version you explicitly use a lazy version.
> Pythonic programming obeys explicit over implicit, which means if you want a lazy version you explicitly use a lazy version.
What does this have to do with anything? If the construct was defined as lazy, then it would be just as explicit. There is nothing about generator expressions that says "here, here, this is lazy".
I've been using Python and Ruby on and off for a couple of years (largely because I haven't found the need to use either seriously in my day job or side projects).
One thing that strikes me as odd is how people describe Python/Ruby as way more readable than Java.
I felt that Python, while more readable than Ruby (because Python uses fewer symbols), still contains more nifty tricks compared to Java.
It's true that the resulting code is shorter, but behind those fewer lines of code bugs might linger, because plenty of "intent" can be hidden deep in the implementation of Python.
The Python maxim that is touted so often, "explicit is better than implicit", seems to correlate better with the much-maligned "Java is too verbose".
Anyhow, the other day I was refreshing my Python and learned about the rich comparison methods you can override (__eq__, __ge__, __gt__, __le__, __lt__). I wonder how overriding those results in fewer lines of code compared to Java, where you override equals and hashCode and implement one Comparator method that returns -1, 0, or 1 to cover the whole spectrum of comparisons (and even equality, given the context).
In isolated offline code, I think Ruby approaches the unreadable - but only because DSLs are fashionable, and you have to understand the library involved to understand the magic that's going on behind the scenes. With sufficient online searching, you can understand things at a surface level, and with enough time with a debugger you can appreciate all the mechanics that are happening, but IMHO it still leaves you with an unpleasant taste in your mouth at the sheer quantity of magic.
I do prefer Ruby to Python though. I much prefer monadic enumerators combined with lambdas to list comprehensions, even though these tend to be eager in Ruby but have the option of being lazy in Python. Itertools helps, but is deeply hampered by Python's overly verbose lambda syntax.
Been thinking about Python vs Ruby lately. I dread reading other people's Ruby code (eg. if there is a problem in a library I'm using) but I don't have that same dread reading Python code. Now if Python had blocks and RubyGems then I would never give Ruby a second look.
What does RubyGems provide that isn't covered by PyPI/eggs? I thought Ruby and Python packaging were at approximate feature parity here (which is to say, both leave something to be desired, but work quite well).
But that's the thing, y'know... It's not just about DSLs; it's also Ruby's syntax, which includes lots of symbols (and of course DSLs increase the learning curve as well).
By symbol I mean the use of tilde, pound sign, ampersand, and colon.
Seems to me that it's more of personal preference than anything else and this is why for me personally, Ruby is pretty close to Perl.
Somehow I prefer verbose over too terse. I prefer to read code in words rather than in a mix of words and symbols.
The comparison functions are there to address comparing things for equality when they don't have an order. So in some sense, they are more explicit (a minute with the docs explains that implementations only need to directly define two of them). The -1,0,1 style comparison (__cmp__) still exists in Python 2.7; it was dropped in 3.0. Probably one of those things that could have gone either way.
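On "only need to directly define two of them": the stdlib even automates the rest. A sketch with a made-up Version class:

```python
import functools

# functools.total_ordering (available since 2.7) fills in the remaining
# rich comparisons from __eq__ plus one ordering method.
@functools.total_ordering
class Version:
    def __init__(self, major, minor):
        self.major, self.minor = major, minor

    def __eq__(self, other):
        return (self.major, self.minor) == (other.major, other.minor)

    def __lt__(self, other):
        return (self.major, self.minor) < (other.major, other.minor)

print(Version(1, 2) < Version(1, 10))  # True
print(Version(2, 0) >= Version(1, 9))  # True -- derived by total_ordering
```

So the Python side ends up roughly as short as the Java Comparator version, just spelled differently.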
I think comparisons about readability aren't very useful (they're subjective, and "readable enough" counts), but I sure like the relative terseness of Python, to the point that I lament people who come over and write Java in Python. (I don't mean to direct that at you; it's just a thing that happens, where people don't take advantage of things that are very idiomatic and thus clear even when terse.)
Fair point, and I learned early on during my college years not to import habits from one language into another. Mainly because what's "idiomatic" tends to depend on the internal implementation of the language.
When did support for dictionary comprehensions make it into 2.x? I could've sworn it didn't used to work, but I just tried it in the shell and sure enough, it does in 2.7.3.
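For reference, dict (and set) comprehensions landed in 2.7, backported from 3.0. A quick sketch:

```python
# Dict comprehension (2.7+):
squares = {x: x * x for x in range(5)}
print(squares[4])  # 16

# Before 2.7 the usual spelling was dict() over a generator of pairs:
squares_old = dict((x, x * x) for x in range(5))
print(squares == squares_old)  # True
```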
The order of items in the dictionary isn't stable, but the order of .keys() is guaranteed to be the same as the order of .values() (as long as you don't modify the dict in between calling one then the other).
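That matching-order guarantee is what makes the classic dict-inversion one-liner safe (assuming the values are hashable and unique):

```python
# .keys() and .values() come back in corresponding order as long as the
# dict isn't modified between the two calls, so zip pairs them correctly:
d = {"a": 1, "b": 2, "c": 3}
inverted = dict(zip(d.values(), d.keys()))
print(inverted)  # {1: 'a', 2: 'b', 3: 'c'}
```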
Unpacking is a limited form of what is called destructuring in other languages like Clojure. I would say that, in terms of feature-richness: unpacking < destructuring < pattern matching.
First-class patterns are increasingly becoming available in some languages which offer pattern matching (in particular, Haskell will have them soon).
I did a little googling, but am finding it difficult to find good clear information - do you have any articles where I can read about first-class patterns?
How do view patterns make patterns "first-class"? To me that means able to manipulate them as values, I don't see how view patterns allow that, they're just syntactic sugar for case expressions.
I don't mean to belittle Python here; I think it's a great language and I find its unpacking useful. I'm only trying to demonstrate that the concept can be (and is, in some other languages like Clojure) taken further to make it more useful still.
I think that Python's unpacking allows most, if not all, of Clojure's sequence destructuring for tuples and lists. Clojure takes it a bit further, however, by applying it to all sequences. For example, you could do this:
a,b,c = "XYZ"
because strings are also sequences. You can also do something like this (excuse the awkward syntax as I try to express it in pseudo-Python):
a,b : c as d = [1,2,3,4,5]
# a = 1
# b = 2
# c = [3, 4, 5]
# d = [1, 2, 3, 4, 5]
This might not be so useful in python, since the list already is d and c is simply slicing the end from the list, but Clojure allows you to destructure function arguments: (defn foo [[a, b & c :as d]] ...) when passed the above list (foo [1 2 3 4 5]) would bind the variables as shown in the above comments.
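For comparison, Python 3's extended unpacking gets close to the `[a b & c]` part, though there's no direct equivalent of `:as` (you just keep a separate reference to the whole sequence):

```python
# Python 3: a starred target plays the role of Clojure's "& c".
d = [1, 2, 3, 4, 5]
a, b, *c = d
print(a, b, c)  # 1 2 [3, 4, 5]

# Strings are sequences in Python too, so this works as well:
x, y, z = "XYZ"
print(x, y, z)  # X Y Z
```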
Where destructuring really shines, though, is that you can destructure maps (dictionaries in Python) and vectors can also be treated as maps (their keys are the indices), so you can do stuff like this:
a, {[b, {:keys [c, d]}] :foo}, e = [1, {foo: [2, {c: 3, d: 4}], bar: 9}, 5]
# a = 1
# b = 2
# c = 3
# d = 4
# e = 5
Coming from a long history of languages like BASIC and Pascal, I will bookmark this tutorial. It seems to open up a lot of interesting Python features that were, quite frankly, not always easy to understand when described in plain text, but now seem pretty simple when presented as examples.
I'll also think about the "collection of simple examples" next time I want to document something.
I used to bookmark tutorials like this, but now ...
I just bookmark the HN discussion. Not only does that retain a link to the tutorial, but it also retains a link to a discussion that usually adds value to the tutorial.
I'm assuming that HN discussions remain available for many years. Anyone know for sure if that is/isn't true?
Instapaper has become my bookmarking tool. I don't even really sort it out anymore. I know it should be there when I need it and mostly the actual article is saved.
Yes. And the order of the two for's in the list comprehension was deliberately kept the same as the order of the two for loops in your explicit code, for ease of remembering how the former (i.e. the list comp) works.
That order is the part that confused me. I expected to read it right-to-left; instead it's left-to-right for the for statements, with the expression on the left evaluated at the end.
I see now that the Python docs explain this very clearly...
A good reference, to be sure, but man, do I resent the term “trick” in programming. It implies a deception, or something clever that you wouldn’t think to look for, like opening a wine bottle with a shoe. These aren’t tricks, they’re (largely) standard library features that you would simply expect to exist. But maybe I’m underestimating the NIH effect.
Hm... Python only supports 1000 recursions? That seems unsafe. It seems like anyone who writes functional-style code will run the risk of a stack overflow.
Believe it or not, it's actually deliberate, and not that uncommon in other languages either. The only way you can realistically have 'unlimited' recursion is to use tail call optimisation, which is deliberately not implemented in Python:
Essentially, as I understand it, his take is that massive recursion is confusing, and 'unpythonic'. And with the whole 'explicit is better than implicit' thing, I guess there is a point.
Sometimes recursion is the best/most obvious solution, and then it is a bit annoying to not have TCO, but you can usually work around it.
Although I think the idea is that you shouldn't need to. Any recursive implementation can either be easily translated to work iteratively, or can be implemented in such a way that 1000 levels of recursion should be more than enough.
For example, 1000 levels of recursion is more than enough to count all the nodes in a binary tree unless the tree is extremely poorly balanced or inordinately large.
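To illustrate the workaround: a sketch of an iterative node count using an explicit stack, on a deliberately degenerate tree (nodes modeled here as `(left, right)` tuples, with `None` for empty; the representation is made up for illustration):

```python
import sys

# The default limit is an implementation detail, but on CPython it's 1000:
print(sys.getrecursionlimit())

def count_nodes(root):
    """Count nodes without recursion, so tree depth doesn't matter."""
    count, stack = 0, [root]
    while stack:
        node = stack.pop()
        if node is not None:
            count += 1
            stack.extend(node)  # push both (left, right) children
    return count

# A degenerate "tree" 5000 levels deep -- far past the recursion limit,
# where a naive recursive count would raise RecursionError:
tree = None
for _ in range(5000):
    tree = (tree, None)
print(count_nodes(tree))  # 5000
```

The explicit stack trades call frames for heap allocations, which is exactly the translation the parent comment is describing.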
No better than PHP? That's more than a little ridiculous. Every language has shortcomings, and there's no silver bullet/perfect multitool that can or should be used for everything.
It's too bad it's apparently not good for... whatever it is you do. It is pretty great for some things, and having spent more than a few years with both Python and PHP, I can say without reservation that there is absolutely no comparison in terms of language quality.
P.G. is right about PHP. This language is better than Python in various ways. For ex. it does not scare people away if you want to sell your project, because everyone and their grandma knows PHP. Most successful web pages (eg. FB) started out with PHP, not Lisp, C++, Python, Java, but PHP.
There are only a handful of language runtimes that got rid of the GIL entirely, namely the JVM, .NET, and Rubinius. Other languages either use non-native threads or don't support threading at all. (E.g. Node.js has no threading.)