
Python looks more and more foreign with each release. I'm not sure what happened after 3.3 but it seems like the whole philosophy of "pythonic", emphasizing simplicity, readability and "only one straightforward way to do it" is rapidly disappearing.

“I've come up with a set of rules that describe our reactions to technologies:

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you're thirty-five is against the natural order of things.”

― Douglas Adams, The Salmon of Doubt

It's wrong to frame this as resistance to change for no reason. See my other comment. I see some of this stuff as repeating mistakes that were made in the design of Perl. ...but there are relatively few people around these days who know Perl well enough to recognize the way in which history is repeating itself, and that has at least something to do with age.

"resistance-to-change for-no-reason" vs "resistance-to change-for-no-reason" :)

This is possibly the best example of the ambiguity of language I've ever seen. Two contradictory meanings expressed in the exact same phrase, and both of them are valid in the broader context.

Jeez. How many people read the same phrase with either of those two meanings and then go on to form opinions, and even make decisions, based on the resulting meaning?

Me, I am old enough to know Perl, and I've got plenty of one-line skeletons in my own closet. And it more-or-less entered the world already vastly more TMTOWTDI-y than Python is after 3 decades.

FWIW, I tend to think of comparisons to Perl as being a lot like Nazi comparisons, only for programming languages. And I do think there's some wisdom to the Godwin's Law idea that the first person to make a Nazi comparison is understood to have automatically lost the argument.

It's just that, at this point, Perl is both so near-universally reviled, and so well-understood to be way too optimized for code golf, that any comparison involving it is kind of a conversation-killer. As soon as it shows up, your best options are to either quietly ignore the statement in which the comparison was made, or join in escalating things into a flamewar.

I wouldn't call it reviled. Perl makes for a poor general purpose programming language, it always did. You can write an HTTP server in Perl but you probably shouldn't. It's very good for what it was always intended for, those situations where you need to process some data, but like just once not every week for the rest of eternity.

I've never regretted a Perl program that I wrote, used and discarded. And I've never been content with a Perl program I found I was still using a week after I wrote it.

The point #1 is expanded on in Feral by George Monbiot. Basically, we have a tendency to see the outside world we grew up with as the way things naturally should be, ignoring that previous generations may have changed it to be that way. That sheep-grazed pastoral landscape is easy to view as a thing worth preserving, but to an ecologist it might be a barren waste where there used to be a beautiful forest.

Forewarned is forearmed. I headed into adulthood watching out for such mirages. For example: Making sure to listen to pop music enough that it does exactly what pop music is supposed to do (worm its way into your subconscious) so I don't wake up one morning unaccountably believing that Kylie Minogue was good but Taylor Swift isn't.

My understanding of Python will probably never be quite as good as my understanding of C, but I can live with that.

How do you know to listen to Taylor Swift or whatever? In the last century it was easy to be in sync: you could just watch MTV. Is there something keeping the notion of pop coherent these days?

Not exactly pop, but there are some great weekly music podcasts that I listen to to hear new music, which tend to be a little more indie pop/rock/${genre} than pop :)

- Music That Matters: https://omny.fm/shows/kexp-presents-music-that-matters/playl...

- KEXP Song of the Day: https://omny.fm/shows/kexp-song-of-the-day

- All Songs Considered: https://www.npr.org/rss/podcast.php?id=510019

- KCRW Today's Top Tune: https://www.kcrw.com/music/shows/todays-top-tune/rss.xml

Apple/Google Music or Spotify or Pandora all have pop playlists that play the current top 100 songs on rotation. The Billboard Hot 100 also lists popular western music if you just want a list to review on your own.

I’d argue it’s easier now to stay in sync than even when MTV was popular. With MTV you needed a cable subscription and had to be sitting at a TV; now SiriusXM or Apple/Google/Spotify can stream it right to your phone, laptop or tablet, and regular FM radio will play it on the local Top 40 station.

I don't think it's just >35-year-olds who find what's going on in Python against the natural order of things?

I'm 34 and I don't like this, so it's definitely not only those above 35. Jokes aside, I would say I'm a minimalist and this is where my resistance comes from. One of the things that I dislike the most in programming is feature creep. I prefer smaller languages. I like the idea of having a more minimal feature set that doesn't change very much. In a language with less features, you might have to write slightly more code, but the code you write will be more readable to everyone else. Said language should also be easier to learn.

IMO, the more complex a system is, the more fragile it tends to become. The same is true for programming languages. Features will clash with each other. You'll end up having 10 different ways to achieve the same thing, and it won't be obvious which direction to go.

Furthermore, I did my grad studies in compilers. I've thought about writing an optimizing JIT for Python. I really feel like CPython is needlessly slow, and it's kind of embarrassing, in an age where single-core performance is reaching a plateau, to waste so many CPU cycles interpreting a language. We have the technology to do much better. However, the fact that Python is a fast-moving target makes it very difficult to catch up. If Python were a smaller, more stable language, this wouldn't be so difficult.

> In a language with less features, you might have to write slightly more code, but the code you write will be more readable to everyone else.

I disagree with this, which is precisely why I prefer feature rich languages like Java or better yet Kotlin. It doesn't get much more readable than something like:

    users.asSequence()
        .filter { it.lastName.startsWith("S") }
        .sortedBy { it.lastName }
        .take(3)
        .toList()
Now try writing that in Go or Python and compare the readability.

Python is a little more readable, but both Python and Kotlin are perfectly clear in this case:

    sorted((u for u in users
            if u.last_name.startswith("S")),
           key=lambda u: u.last_name)[:3]
If last_name is a function, which it often would be in Python, it gets better:

    sorted((u for u in users
            if last_name(u).startswith("S")),
           key=last_name)[:3]
However, I think you probably got the sort key wrong if you're taking the first three items of the result. Maybe you meant key=abuse_score, reverse=True, or something.

I disagree this Python version is as readable, and here’s why. It’s about as many characters but more complex. The Kotlin version performs several distinct actions, each being clear in its purpose. These actions have the same syntax (e.g. requiring less parsing effort). The Python version mixes at least 4 different language syntaxes/features, being list comprehension, the if special form in the list comprehension, keywords, and lambda functions.

On top of the lessened readability, the Kotlin version makes it very easy to add, subtract, or comment out lines/actions which really helps when debugging. The Kotlin version is almost identical in structure to how you’d do it in Rust, Elixir, etc.

I agree. I don't know Kotlin and am reasonably well versed in Python, yet I immediately grasp the Kotlin example as more readable, while having to squint at the Python one for a few seconds. (this is anecdotal of course, and does not account for the example possibly being contrived)

One thing that I like more in the Python version is that it contains fewer names: .asSequence and .take are replaced by operators of much greater generality, while the ugly implicitly declared identifier it is replaced by explicitly deciding that sequence elements are u.

It should also be noted that Python would allow a more functional style, possibly leaving out the list comprehension.
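For instance, a sketch of the same query in that more functional style (the User class and sample data here are made up for illustration; they stand in for the users of the example above):

```python
from operator import attrgetter

# Hypothetical stand-in for the users collection in the example above:
class User:
    def __init__(self, last_name):
        self.last_name = last_name

users = [User("Smith"), User("Jones"), User("Santos"),
         User("Shaw"), User("Stern")]

# filter + sorted + slicing, with no comprehension syntax at all:
result = sorted(
    filter(lambda u: u.last_name.startswith("S"), users),
    key=attrgetter("last_name"),
)[:3]
```

attrgetter replaces the lambda sort key, which reads closer to Kotlin's sortedBy.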

It's surprising to me that there are people who disagree with my opinion about this, but it suggests that my familiarity with Python has damaged my perspective. You're clearly much less familiar with Python (this code doesn't contain any list comprehensions, for example), so I think your opinion about readability is probably a lot more objective than mine.

FWIW most of the programming I've ever done has been in Python, and while I have no trouble understanding either snippet, I think that the Kotlin snippet is much clearer in intent and structure.

I certainly didn't mean to imply that only someone unfamiliar with Python could prefer the Kotlin version! Perhaps you thought I meant that, but I didn't.

> this code doesn't contain any list comprehensions, for example

It does contain a generator expression though, which is the same as a list comprehension in general structure, but slightly more confusing because it doesn't have the relationship to lists that square brackets in a list comprehension would have given it.

Yes, it shares the structure of a list comprehension, but has different semantics. In this case a listcomp would have worked just as well.
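For readers following along, the semantic difference is only about laziness; a small illustrative sketch (made-up data):

```python
nums = [3, 1, 4, 1, 5]

gen = (n * 2 for n in nums)  # generator expression: lazy, computes on demand
lst = [n * 2 for n in nums]  # list comprehension: eager, the list exists now

from_gen = sorted(gen)  # sorted() drains the generator here
from_lst = sorted(lst)  # lst is unaffected and can be reused
# gen is now exhausted; iterating it again yields nothing.
```

In the sorted() call from the thread, either form works because the sequence is consumed exactly once.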

My point, though, was that not being able to tell the difference was a key "tell" that the comment author was not very familiar with Python — in some contexts, that would tend to undermine credibility in their comment (and then it would be rude to point it out), but in this context, it probably makes their opinion more objective.

Good point, though it's less my familiarity with Python and more that I tend to simplify and call generator expressions list comprehensions unless the laziness is important to call out (meta-laziness there? ;) ). Mainly since L.C.'s came first and describing the differences is tedious.

I think you're all fighting for nothing here.

The map filter chaining is obviously simpler, but python code is not that difficult and it's a no brainer task anyway.

It's true the Python version is still relatively easy. It may only take, say, 1.3 seconds vs 1.1 to parse, but it adds up.

This isn't very readable at all and certainly not any more readable than a chain of method calls, being that you've spread the operations out in different places. It's not even syntactically obvious what the `key` argument is passed to if one doesn't know that `sorted` takes it. None of those problems exist when piping through normal functions or chaining method calls.

Python is for the most part overrated when it comes to these things, IMO. It's a nice enough language but it's aged badly and has an undeserved reputation for concision, readability and being "simple".

C# supports both conventions (in LINQ) - I mean the Kotlin one from the grandparent comment, and the Python's from parent's.

The method chaining syntax and the query syntax are alternatives. I think most devs lean towards the former, considered to be cleaner... whereas the latter is probably easier to learn in the beginning for those unfamiliar with piping/functional style, owing to its SQL-like feel.

ReSharper would offer converting the latter to the former, and that's how I learned method-chaining LINQ back in the day.

A little off-topic but how does that work? Is 'it' a magic variable referring to the first argument? Never seen magic variables that blend into lambdas like that before... would've expected $1 or something like that.

The idea of anaphoric macros[1] is first found in Paul Graham's "On Lisp"[2] and is based on the linguistic concept of anaphora, an expression whose meaning depends on the meaning of another expression in its context. An anaphor (like "it") is such a referring term.

I think if you like this idea, you will really like the book. Better still, you can download the pdf for free.

[1] https://en.wikipedia.org/wiki/Anaphoric_macro [2] http://www.paulgraham.com/onlisp.html

Inside of any lambda that takes a single parameter you can refer to the parameter as 'it'. If you prefer to name your parameters you can do so as well, it's just slightly more verbose:

    .filter { user -> user.lastName.startsWith("S") }
    .sortedBy { user -> user.lastName }

Yeah, a bit of PG’s Arc influence in the wild.

I believe Groovy made this popular rather than Arc, and it's likely where Kotlin's came from, due to being in the Java ecosystem.

Most APL derivatives (J, K, Q, A) had implicit arguments for functions that didn't explicitly declare them (up to 3: x, y, and z).

Probably before then too.

Dyalog APL too, but none of them call the implicit argument "it".

Groovy is from 2003. PG keynoted PyCon in 2003 talking about his progress on Arc: http://www.paulgraham.com/hundred.html. He had been talking about Arc online for a couple of years at that point, including in particular the convenience of "anaphoric macros" that defined the identifier "it" as an implicit argument.

(He'd also written about that more at length in the 1990s in On Lisp, but many more people became acquainted with his language-design ideas in the 2001–2003 period, thanks to Lightweight Languages and his increasingly popular series of essays.)

But surely Perl's $_ was way more influential than an obscure PG talk. I was reading PG way back in 2004, and I had never heard of anaphoric macros until now.

Wait, you think that, in the context of programming language design, a PyCon keynote is an obscure talk? I don't know what to say about that. It might be possible for you to be more wrong, but it would be very challenging.

Anyway, I'm talking specifically about the use of the identifier "it" in Kotlin, not implicitly or contextually defined identifiers in general, which are indeed a much more widespread concept, embracing Perl's $_ and @_, awk's $0 (and for that matter $1 and $fieldnumber and so on), Dyalog APL's α and ω, Smalltalk's "self", C++'s "this", dynamically-scoped variables in general, and for that matter de Bruijn numbering.

> a PyCon keynote is an obscure talk

Compared to the existence of Perl, yes. Anyone who does any amount of Perl learns that $_ is the implicit argument ("like 'it'") to most functions. It's pretty much one of Perl's main deals. The talk has about 100K views on YouTube, which is pretty good, but Perl is in another league.

Too bad Apache Groovy itself didn't remain popular after popularizing the name "it" for the much older idea of contextually-defined pronouns in programming languages. Using the names of pronouns in English (like "this" and "it") is easier for an English-speaking programmer to understand than symbols like "$1" or "_". But because of Groovy's bad project management, another programming language (Kotlin) is becoming widely known for introducing the "it" name.

Pretty sure the Go community will be fine with not being feature rich, since simplicity, maintainability and getting new people up to speed matter more for them.

The Go community has gone too far the other way for me; the endless repetition introduces its own complexity.

Simple core languages that are syntactically extensible with libraries have the best of both worlds: https://vvvvalvalval.github.io/posts/2018-01-06-so-yeah-abou...

sorted(u for u in users if u.last_name.startswith("S"), key=lambda u: u.last_name)[:3]

Though I will concede that I also find the fluent interface variant nicer.

That doesn't parse :-)

You’re doing it wrong :)

  users.apply {
      filter { it.lastName.startsWith("S") }
      sortedBy { it.lastName }
  }
(totally untested)

> Furthermore, I did my grad studies in compilers. I've thought about writing an optimizing JIT for Python. I really feel like CPython is needlessly slow, and it's kind of embarrassing,

Many have tried and failed (Google and Dropbox, to name a couple), among countless other attempts.

It lags a bit in releases, but I understood PyPy to be essentially successful?

Yes, PyPy is fantastic for long-running processes that aren't primarily wrappers around C code. In my experience, the speedups you see in its benchmarks translate to the real world very well.

Yes, and part of the reason they failed is the reason I pointed to: Python is a fast moving target, with an increasing number of features.

It's not the new features of Python that make it hard to optimize; it's the fundamental dynamic nature of the language that was there from day one. Syntactic sugar doesn't have an impact one way or the other on optimizing Python.

The new features aren't just syntactic, they're also new libraries that come standard with CPython, etc. If you want to implement a Python JIT that people will use, you have to match everything CPython supports. Furthermore, since the people behind CPython don't care about JIT, you also can't count on them not adding language features that will break optimizations present in your JIT. You can't count on these being just "syntactic sugar". Even if you could though, in order to keep up it means you have to use CPython's front-end, or constantly implement every syntactic tweak CPython does.

Lastly, AFAIK, CPython's FFI API is actually more of a problem than the dynamic semantics of the language. You can expose Python objects directly to C code. That makes it very hard for a JIT to represent said Python objects in an efficient way internally.

> In a language with less features, you might have to write slightly more code, but the code you write will be more readable to everyone else.

That's not universally true. C# has more features than Java but is generally easier to read and the intent of the code is easier to follow. The lack of features, like properties or unsigned integers, leads to Java coders creating much more convoluted solutions.

If languages with less features were universally better we would all be using C and BASIC for everything.

I think what matters is the orthogonality of the features. E.g. having so many ways to do string formatting, or now multiple ways of doing assignments, is not orthogonal and thus can be seen as clutter.

I'm 38, I'm fine with these changes, and I've been using Python for 15+ years.

I can plainly see how these changes will actually make my code cleaner and more obvious while saving me keystrokes.

I also don't think these changes are very drastic. They're opt-in, don't break anything, and look to lead to cleaner code. I love the walrus operator (not so sure about the name, but hey, C++ is getting the spaceship operator... as has been said, naming things is hard). To me, the change of print from a statement to a function has been the hardest Python change over the years. Just too much mental momentum. Even though I've been on Python 3 for years, I still make the mistake of trying to use it as a statement. That said, I think it was the right (if painful) move.

I don't speak for everyone over 35, just myself.

theory : age itself with regards to computing has nothing to do with how old you act (with regards to computing), the time you spent doing a specific thing is what grows that 'characteristic'.

Anecdote : i'm fairly young, but i've been involved with python long enough and traveled to enough pycons to be a bit jaded with regards to change within the language.

I'm fairly certain it's only due to the additional cognitive load that's thrust upon me when I must learn a new nuance to a skill that I already considered myself proficient at.

in other words : i'm resistant to change because i'm lazy, and because it (the language, and the way I did things previously) works for me. Both reasons are selfish and invalid, to a degree.

Conversely, some of us oldsters think the outrage is way overblown.

No, those aren't really the reasons for my reaction. And if I told you my age, you would probably switch your argument and say that I'm far too young to criticize ;)

I am an example which supports this notion. I've done some Python programming about 10 years ago but then took a break from programming altogether for the last 9 years. Last year I got back into it and have been using Python 3.7, and I personally love all the most recent stuff. I hate having to go back to 3.5 or even 3.6, and I end up pulling in stuff from futures.

This 'resistance to change' catchall argument puts everything beyond criticism, and it can be used/abused in every case of criticism. It seeks to reframe 'change' from a neutral word - change can be good or bad - to a positive instead of focusing on the specifics.

Anyone making this argument should be prepared to accept that every single criticism they make in their life moving forward can be framed as 'their resistance to change'.

This kind of personalization of specific criticism is disingenuous and political and has usually been used as a PR strategy to push through unpopular decisions. Better to respond to specific criticisms than reach for a generic emotional argument that seeks to delegitimize scrutiny and criticism.

True, but this was not “specific criticism”. It was a general dismissing criticism without details, and so can be refuted with a similarly detail-less answer. A detailed criticism deserves a reasoned and detailed answer, but vague criticism gets a generic rebuttal.

Does that mean someone born in 2008 will think C++ is simple and elegant?

I am both a Python programmer and a C++ programmer. I have programmed professionally full time in one or the other for years at a time. I think C++ is now a much better language than when I first learnt it (cfront). In particular, C++11 really fixed a lot of the memory issues with shared_ptr and the std:: algorithms. It is a better language now if you are doing anything larger than a program that takes more than a few weeks to write.

On the other hand, I love Python for everything else, and some of the new stuff is great, but making a new way to print strings over and over tells me some people have too much spare time or not enough real work to do. In my opinion, formatting a string to print a debug statement should be as concise as possible, whereas a lot of these fancier formatting systems are better suited to stuff that ends up staying around for use by other people. Luckily there are ways to use ye olde printf-style formatters in both for those times.

C++ might be "better" now (I doubt it, to be honest, it just has more features that try to fix the issue at hand; that you're using C++), but it will never, ever get simpler or simple enough. They'd have to remove something like 75% of the language to end up with something that approaches simplicity and even then there are languages that would undoubtedly do those remaining 25% much better.

I stopped writing C++ at some point in 2008/2009 but I still keep track of it to some extent and I'm continually surprised by the nonsense that is introduced into the language. The whole RAII movement, for example, is just one massive band-aid on top of the previous mistake of allowing exceptions, etc..

It'd be mostly fine in the long run, but you have all these people using like 15% of C++ and complaining about it all day long, making their libraries unusable from anything that understands C (most of which has drastically improved on the whole paradigm). There's a solution here, and it's not using whichever arbitrary percentage of C++ you've decided on; it's realizing that there are way better languages, designed with real interoperability in mind, to talk about lower-level things.

No, the claim is that it's ordinary and just part of the way the world works.

Good point. I think it should be rephrased in terms of personal familiarity: people who learned C++ before they were 15 indeed think that it's simple and elegant.

Except that Python existed before I was born and I still appreciate the concept of 'Pythonic'. The language should stay true to its roots.

> Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

Yep, I entered the Python world with v2. I eventually reconciled myself to 2.7, and have only recently and begrudgingly embraced 3. Being over 35, I must be incredibly open minded on these things.

Can you give an example of something like this happening to the language? IMO 3.6+ brought many positive additions to the language, which I also think are needed as its audience grows and its use cases expand accordingly.

The walrus operator makes while loops easier to read, write and reason about.
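For example, here is a small sketch of the pattern the walrus operator cleans up (Python 3.8+; read_chunks is a made-up helper):

```python
import io

# Without the walrus operator the read call has to be duplicated:
#     chunk = stream.read(size)
#     while chunk:
#         out.append(chunk)
#         chunk = stream.read(size)
def read_chunks(stream, size=4):
    out = []
    # Assignment and loop test in one expression:
    while chunk := stream.read(size):
        out.append(chunk)
    return out

parts = read_chunks(io.StringIO("abcdefgh"))
```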

Type annotations were a necessary and IMO delightful addition to the language as people started writing bigger production code bases in Python.

Data classes solve a lot of problems, although with the existence of the attrs library I'm not sure we needed them in the standard library as well.
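As a quick sketch of what data classes buy you (Point is a made-up example class):

```python
from dataclasses import dataclass, field

@dataclass
class Point:
    x: float
    y: float
    tags: list = field(default_factory=list)

# __init__, __repr__ and __eq__ are generated automatically:
a = Point(1.0, 2.0)
b = Point(1.0, 2.0)
```

attrs offers the same convenience (and more) as a third-party library, which is the commenter's point about redundancy.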

Async maybe was poorly designed, but I certainly wouldn't complain about its existence in the language.

F strings are %-based interpolation done right, and the sooner the latter are relegated to "backward compatibility only" status the better. They are also more visually consistent with format strings.
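A minimal side-by-side of the three styles (made-up values), all producing the same string:

```python
name, score = "Ada", 0.91234

old = "%s scored %.2f" % (name, score)        # %-interpolation
fmt = "{} scored {:.2f}".format(name, score)  # str.format
new = f"{name} scored {score:.2f}"            # f-string (3.6+)
```

Note the f-string reuses the same format-spec mini-language (:.2f) as str.format, which is the visual consistency being referred to.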

Positional-only arguments have always been in the language; now users can actually use this feature without writing C code.
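A sketch of the Python 3.8+ syntax for this (clamp is a hypothetical function):

```python
# Parameters before the `/` marker are positional-only,
# so callers cannot pass them by keyword.
def clamp(value, low, high, /):
    return max(low, min(high, value))

clamp(15, 0, 10)  # fine
# clamp(value=15, low=0, high=10) raises TypeError
```

Previously this behavior was only available to built-ins and C-extension functions.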

All of the stuff feels very Pythonic to me. Maybe I would have preferred "do/while" instead of the walrus but I'm not going to obsess over one operator.

So what else is there to complain about? Dictionary comprehension? I don't see added complexity here, I see a few specific tools that make the language more expressive, and that you are free to ignore in your own projects if they aren't to your taste.

> F strings are %-based interpolation done right, and the sooner the latter are relegated to "backward compatibility only" status the better. They are also more visually consistent with format strings.

No, f-strings handle a subset of %-based interpolation. They're nice and convenient but e.g. completely unusable for translatable resources (so is str.format incidentally).

What makes % better than .format for translations (and isn't something like Django's _(str) better anyway)?

F strings are obviously non-lazy, but _(tmpl).format(_(part)) seems fine?

`.format` lets you dereference arbitrary attributes and indices (I don't think it lets you call methods though), meaning you can run code and exfiltrate data through translated strings if they're not extremely carefully reviewed, which they often are not.

% only lets you format the values you're given.
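To make the exfiltration risk concrete, here is a hedged sketch (SECRET_KEY and User are made-up names) of how an unreviewed translated template can walk from a passed-in object to module globals via str.format, while %-formatting cannot:

```python
SECRET_KEY = "s3cr3t"  # hypothetical module-level secret

class User:
    def __init__(self, name):
        self.name = name

user = User("mallory")

# A "translated" template that slipped past review: it dereferences
# from the bound method to its module globals and reads the secret.
evil_template = "Hello {u.__init__.__globals__[SECRET_KEY]}"
leaked = evil_template.format(u=user)

# %-interpolation can only substitute the values it is handed:
safe = "Hello %(name)s" % {"name": user.name}
```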

> and isn't something like Django's _(str) better anyway

They're orthogonal. You apply string formatting after you get the translated pattern string from gettext. In fact, Django's own documentation demonstrates this:

    def my_view(request, m, d):
        output = _('Today is %(month)s %(day)s.') % {'month': m, 'day': d}
        return HttpResponse(output)

What would "do/while" look like in Python? Since blocks don't have end markers (e.g. "end", "}", etc.) there's nowhere to put the while expression if you want the syntax to be consistent with the rest of the language.

One solution would be to borrow from Perl. You make a do block that executes once unless continued, and allow conditions on break and continue:

    do:
        ...
        continue if condition
And you can now express "skip ahead" with a `break if X` as well.

Yes, although you don't have to be so perlish as to do the if in that order:

    do:
        ...
        if condition:
            continue

I envisioned it like "if/else" or "for/else" or "while/else", where a "do" block must be followed by a "while" block.

    x = 0
    do:
        x += 1
    while:
        x < 10

This completely contradicts the rest of Python grammar, and indeed many languages’ grammars. The consistent way would then be `while x < 10` but that too looks ridiculous. The issue is that you can’t have post-clause syntax in Python due to its infamous spacing-is-syntax idea.

I'm not sure why the consistent way looks ridiculous.

    do:
        x += 1
    while x < 10
It's just that a compound statement consumes the trailing while clause.

Decorators already precede a function (or class) definition[2], and one alternative for the ill-fated switch statement[1] was to have switch precede the case blocks to avoid excessive indentation.

So there's plenty of precedent in the other direction.

[1]: https://www.python.org/dev/peps/pep-3103/#alternative-3

[2]: https://docs.python.org/3/reference/grammar.html

I think you're really stretching it when you say "there's plenty of precedent"; arguably there is none, as the decorator syntax is pre-clause and thus poses no indentation-reading issues. So too for the proposed switch statement syntax. Then there is the fact that the decorator syntax is perhaps the most alien of all Python syntax, sometimes criticized for being Perlesque, perish the thought (though on account of being introduced by creative interpretation of a single special character, so perhaps unrelated).

My main gripe is the indentation. Your code reads as if the while condition is tested after the loop finishes. What if the while statement was part of the loop and could be placed arbitrarily?

    do:
        x += 1
        while x < 10
IOW `do:` translates to `while True:` and `while x` to `if not x: break`.

Addendum: I would also entertain forcing the `while` to be at the end of the loop -- as I'm not sure what this would do

    do:
        if foo():
            while x < 10

I think it's precedent because it's just a line after a block instead of before it. It certainly is a break from Python's "big outline" look.

> What if the while statement was part of the loop and could be placed arbitrarily?

If you're open to that, I had thought this was a bridge too far, but:

    do:
        break if some_condition
        continue if some_other_condition
Under that scheme, the semantics translate to:

    while True:
        if some_condition: break
        if some_other_condition: continue
        break
And, of course, the `break if` and `continue if` syntax would be general.

Of course you can have post-clause syntax: if...else, try...except, for...else, etc.

(Edit: Actually, I think I know what you were saying now, and those aren't quite the same thing as they need a line after them.)

I do think the condition on the next line isn't the way to solve this problem though (and I don't think it needs solving; while True: ... if ...: break does the job).
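For what it's worth, that while True pattern is already a faithful do/while in current Python (illustrative values):

```python
# do: x += 1; while x < 10 -- emulated with while True + break:
x = 0
while True:
    x += 1             # the body always runs at least once
    if not (x < 10):   # trailing condition check
        break
```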

Why does `while x < 10` look ridiculous? It looks exactly like the syntax for regular while loops, just in this case it's after a `do:` block. And the example above yours looks like try/catch syntax, but tbh I like the one you suggested a bit more.

You're right, it would be pretty weird to rely on implicitly "returning" a value from an expression like that.

But I don't think having it all on one line would be that bad.

Most code still looks like traditional Python. Just like metaprogramming or monkey patching, the new features are used sparingly by the community. Even the less controversial type hints are here on maybe 10 percent of the code out there.

It's all about the culture. And Python culture has been protecting us from abuses for 20 years, while still allowing us to have cool toys.

Besides, in that release (and even the previous one), apart from the walrus operator, which I predict will be used in moderation, I don't see any alien-looking stuff. This kind of evolution speed is quite conservative IMO.

Whatever you do, there will always be people complaining, I guess. After all, I also hear all the time that Python doesn't change fast enough, or lacks some black magic from functional languages.

> Even the less controversial type hints are here on maybe 10 percent of the code out there.

I think this metric is grossly overestimated. Or your scope for "out there" is considering some smaller subset of python code than what I'm imagining.

I think the evolution of the language is a great thing and I like the idea of the type hints too. But I don't think most folks capitalize on this yet.

I mean 10% of new code for which type hints are a proper use case, so mostly libs, and targeting Python 3.5+.

Of course, in a world where Python 2.7 is still a large code base and Python is used a lot for scripting, this will be far from the truth for the entire ecosystem.

The idea that types are hostile to scripting sounds really weird to me. Turtle[0] in Haskell is absolutely amazing for scripting -- especially if you pair it with Stack (with its shebang support) -- and it is as strongly typed as Haskell.

There is a bit of learning curve (because, well, it's not shell which is what most people are used to), and you do have to please the compiler before being able to run your script, but OTOH, you'll basically never have that "oops, I just deleted my working directory because I used a bad $VARIABLE" experience.

[0] http://hackage.haskell.org/package/turtle

What's an example of black magic from functional languages?

If you complained more specifically, it would be possible to discuss. For what was described in the article, I don't see anything "foreign". Python has always been about increasing code readability, and these improvements align well with that philosophy.

i've been hearing this since 1.5 => 2.0 (list comprehensions), then 2.2 (new object model), 2.4 (decorators)...

happy python programmer since 1.5, currently maintaining a code base in 3.7, happy about 3.8.

I cut my teeth on 2.2-2.4 and remember getting my hand slapped when 2.4 landed and I used a decorator for the first time.

It was to allow only certain HTTP verbs on a controller function. A pattern adopted by most Python web frameworks today.

That's especially funny given how everybody screams "that's not pythonic!!1!" nowadays when somebody does _not_ use a list comprehension...

The '*' and '/' in function parameter lists for positional/keyword arguments look particularly ugly and unintuitive to me. More magic symbols to memorize or look up.

I also cannot honestly think of a case where I want that behaviour.

The "pow" example looks more like a case where the C side should be fixed.
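For anyone who hasn't tried them, here's a minimal sketch of how the two markers behave (the function and parameter names are made up for illustration):

```python
# Python 3.8+: parameters before `/` are positional-only,
# parameters after `*` are keyword-only, and those in between accept either.
def combine(a, b, /, c, *, d):
    return (a, b, c, d)

combine(1, 2, 3, d=4)      # OK: a, b, c positional, d by keyword
combine(1, 2, c=3, d=4)    # OK: c may be passed either way
# combine(a=1, b=2, c=3, d=4)  # TypeError: a and b are positional-only
# combine(1, 2, 3, 4)          # TypeError: d is keyword-only
```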

> I also cannot honestly think of a case where I want that behaviour.

There's plenty of situations where a named argument does not help, and encoding it can only hurt. It makes little to no sense to name the first argument to `dict.update` for instance. Or the argument to `ord`.

That, incidentally, is why Swift added support for positional-only parameters (though it has no concept of keyword-or-positional).

> That, incidentally, is why Swift added support for positional-only parameters (though it has no concept of keyword-or-positional).

Swift's syntax is a lot more intuitive and consistent:

    function(parameterWithImplicitlyRequiredLabel: Int,
             differentArgumentLabel internalParameterName: Int,
             _ parameterWithoutLabel: Int, 
             variadicParameterWithLabel: Int...)
which you would call as

    function(parameterWithImplicitlyRequiredLabel: 1, differentArgumentLabel: 2, 3, variadicParameterWithLabel: 4, 5, 6, 7)
[0] https://docs.swift.org/swift-book/LanguageGuide/Functions.ht...

It does not help, but it doesn't hurt enough to warrant special syntax to avoid it.

Yes, it limits your ability to rename a local variable, but that seems minor.

Or where the method should be split into several different methods.

Beyond the older-than-35 reason, I think a lot of folks are used to the rate of new features because there was a 5 year period where everyone was on 2.7 while the new stuff landed in 3.x, and 3.x wasn't ready for deployment.

In reality, the 2.x releases had a lot of significant changes. Off the top of my head: context managers, a new OOP/multiple-inheritance model, division operator changes, and lots of new modules.

It sucks that one's language is on the upgrade treadmill like everything else, but language design is hard, and we keep coming up with new cool things to put in it.

I don't know about Python 3.8, but Python 3.7 is absolutely amazing. It is the result of 2 decades of slogging along, improving bit by bit, and I hope that continues.

In my experience, every technology focused on building a "simple" alternative to a long-established "complex" technology is doomed to discover exactly _why_ the other one became "complex." It also spawns at least five "simple" alternatives of its own along the way.

Doesn't mean nothing good comes out of them, and if it's simplicity that motivates people then eh, I'll take it, but gosh darn the cycle is a bit grating by now.

Could you provide some examples? Without having had that experience, I’m having trouble picturing a concrete example that I would be sure is of the same kind.

Nginx is probably my fav of the surviving-and-thriving ones. It still remains very distinct from Apache, but calling it simpler would be a large stretch.

Projects like qmail discovered the reason in a somewhat _harder_ manner. And yes, I'd argue Python is yet another case, as it grew _at least_ as complex as Perl.

Haha, what was that quote? Something like, any language is going to iterate towards a crappy version of lisp.

Greenspun's Tenth Rule[0]

Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp. - Philip Greenspun

[0] https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule

How would you subvert Greenspun in large codebases without Common Lisp? I once used Drools, the rules engine, which used a dynamic scripting language on Java objects. Python could have replaced that language, with much better tooling, errors, etc.

Could you have written that system in a mix of Java and/or another scripting language such as JRuby[0]?

[0] http://wiki.c2.com/?AlternateHardAndSoftLayers

IIRC MVEL language was integrated deeply into Drools. JRuby would have been awesome.

I'm working on a language with a focus on simplicity and "only one way to do it": https://vlang.io

The development has been going quite well:


Really interesting. For the skeptics, this is not just a proof of concept. There is a real app made using this language: https://volt-app.com/

and the REPL only leaks 1MB [1] to compile and run a hello world program.

1: https://github.com/vlang/v/issues/514

It doesn't anymore.

There are lots of issues that are being fixed. Strange nitpicking on alpha software.

This is great! Thanks for your work. Can V be integrated into existing c++ projects? I work in audio and constantly working in c++ is tiring. I'd love to work in something like V and transpile down.

Thanks! Absolutely. Calling V code is as simple as calling C (V can emit C).

>I'm working on a language with a focus on simplicity and "only one way to do it":

If I wanted a language with "only one way to do it", i'd use Brainfuck. Which, btw, is very easy to learn, well documented, and the same source code runs on many, many platforms.

I see what you're saying, but I kinda like the gets ":=" operator.

But now there are two ways to do assignment. That's not very pythonic, is it?

You think that's bad? Check out:

    a = 17
    print("a=", a)
    print("a=" + str(a))
    print("a=%s" % a)
    # python 3.8 =>
    print(f"{a=}")
So many ways to do it...

But, if it sounds like I agree with you, I actually don't. I feel that the Zen of Python has taken on an almost religious level of veneration in people's minds, and leads to all sorts of unproductive debates. One person can latch onto "there should be one obvious way to do it" and another onto "practicality beats purity" and another onto "readability counts." Who's right? All can be. Or none. All could be applied to this particular case.

The Zen of Python is just a set of rough heuristics, and no heuristic or principle in the field of software development applies 100% of the time, IMHO. <= except for this one ;)

> there should be one obvious way to do it

In cases like this, different ways to do it (all equally good) are needed to get a good coverage of different tastes in obviousness and different nuances in the task.

The point is not uniformity, but avoiding the unpleasant and convoluted workarounds caused by non-obviousness (thus making the language easy to use).

String formatting is not trivial: there is the split (sometimes architectural, sometimes of taste, sometimes of emphasis) between concatenating string pieces, applying a format to objects, setting variable parts in templates, and other points of view; and there is a variety of different needs (cheap and convenient printing of data, dealing with special cases, complex templates...)

And there was also:

    import string
    print(string.Template("a=$a").substitute(a=a))

I never felt like there was only one way to do something in Python. Every Stack Overflow question has a multitude of answers ranging from imperative to functional style and with various benefits and drawbacks.

Python is one of the least "only one way to do things" languages I've used. This even extends to its packaging system, where you can choose between virtualenv, pipenv, pyenv, etc. Same goes for the installation method too, do you want to install Python with Anaconda or use the default method?

As for my personal take on this feature: I think it's really useful. When web-scraping in Python, I oftentimes had to do this:

  while True:
      html_element = scrape_element()
      if html_element:
          break
Now I can do this:

  while not (html_element := scrape_element()):
      pass

Prior to that you could use the special two-argument version of the `iter` function which makes it act completely different than the single argument version:

    for html_element in iter(scrape_element, None):
this calls scrape_element() until it returns None, returning each value.
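A self-contained way to see the two-argument form in action (no scraping involved; the callable here is just a stand-in):

```python
import functools

# iter(callable, sentinel) calls `callable` repeatedly with no arguments
# and stops as soon as it returns the sentinel value.
values = iter([1, 2, 3, None, 4])
reader = functools.partial(next, values)

result = list(iter(reader, None))
print(result)  # -> [1, 2, 3]
```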

It used to be more or less true in the early days. For me, the "one obvious way to do it" ship sailed with list comprehensions, which were introduced in 2.0 (released in 2000).

Packaging isn’t really anything to do with the language syntax, or the zen of Python. Any critiques on Python-the-language?

And pyenv is just a version manager, like rbenv or nvm. I wouldn’t consider its existence confusing, nor would I say being able to install something in more than 1 way has any relevance to the zen of Python!

Should Python create some cross-platform Uber-installer so that there is only one download link?

I don't see why the "zen of Python" shouldn't be applied to its tools too. Tools are part of the developer experience and few/none of the statements/guidelines in the zen of Python are exclusive to Python the programming language.

Regardless of what pyenv is, the rest of my comment about the complexity of Python's tooling still stands. There are too many choices. I've also seen people use pyenv as an alternative to virtualenvs, which is something I have never seen with nvm.

I don't understand why the Python community hasn't coalesced around a single solution to package management that has minimal complexity. It seems like pipenv is the solution, but there is controversy around it and it should have come several years ago. The fact that Python packages are installed globally by default is also pretty terrible, I much prefer it when applications bundle their dependencies. When I do `npm install --global`, the resulting program will always work, regardless of what other packages I have installed on my system.

> Any critiques on Python-the-language?

The point of my original comment was not to necessarily critique the Python programming language, rather it was to point out that adhering to the "zen of Python" is a lost cause because the language/development environment is not designed as a curated experience.

And my original comment did make points about Python-the-language. I talked about how there's many ways to do a single task in Python. One of the responses to it even proved my point:

"Prior to that you could use the special two-argument version of the `iter` function which makes it act completely different than the single argument version: <code sample>".

That unfortunately demonstrates my point.

>Every Stack Overflow question has a multitude of answers ranging from imperative to functional style and with various benefits and drawbacks.

This is one of the reasons I love Python. It's a great exercise to rewrite the same code imperative, recursive, with generators, with iterables, etc. Python is very good at supporting a wide range of programming styles.

I see this criticism every time the walrus operator is brought up.

You do know that this:

    x := 1
Is going to be a syntax error, right? The walrus operator is not permitted in the case of a simple assignment statement. It's only in an expression.

But it used to be that any expression on its own was a valid statement. Is that going to change?

When is an expression allowed to have := in it, is

  (x := 1)
on its own allowed?

For contexts where the walrus is not allowed, see [0]. You'll find that it's generally possible to circumvent the restriction by parenthesising the expression. So yes,

    (x := 1)
is a valid (but poorly written) statement.

But while there are now two ways of doing assignment, I wonder how often people will actually encounter situations where it's difficult to figure out which choice is better.

[0] https://www.python.org/dev/peps/pep-0572/#exceptional-cases
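A quick sketch of the statement-level restriction and the parenthesised workaround (Python 3.8+):

```python
# y := 5        # SyntaxError: bare walrus is not allowed at statement level
(y := 5)        # valid once parenthesised, though the PEP discourages this style
assert y == 5

# The intended use is inside a larger expression:
if (z := y + 1) > 5:
    print(z)    # -> 6
```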

Allowed, yes. But the PEP that introduced walrus operators says not to do it.

Every possible line of code has an alternate ugly way to write it. This isn't a valid criticism. Anyone who decides to start writing simple assignment statements like that deserves to be ridiculed for writing ugly code.

Of course, and there's no reason to write such code.

I just dislike that the simple syntax rule "any expression can be used as a statement" now has an exception.

I haven't been able to think of scenarios where that might have consequences (code generation or refactoring tools?) but that doesn't say much as I'm not that smart.

Edit: having looked at the cases that are disallowed, they remind me of generator expressions. Those are usually written with parens, that can optionally be omitted in some cases. := is the same except they can be omitted in so many cases that it's easier to list the cases where they can't.

I think a generator expression used as a statement already requires the parens, even though they can be omitted e.g. as a single parameter of a function call. So that's probably ok then.

Not really, but neither are ugly nested if statements (Flat is better than nested, readability counts, etcetera). You need to make tradeoffs.

Maybe it would have been better to only have a single := assignment operator to begin with. But it's a few decades too late for that.

For what it's worth, := is for expressions only. Using it as a statement is a syntax error. So there won't be a lot of cases where both are equally elegant options.

Regular = can only be used in statements. Walrus := can only be used in expressions. There's no overlap there. However, := does simplify certain expressions (like those nested if-else statements and the common "while chunk := read()" loop), which I think does justify its existence.
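That read loop looks like this in practice; sketched here with an in-memory buffer standing in for a real file:

```python
import io

f = io.BytesIO(b"abcdefghij")
chunks = []
# Each iteration assigns the next chunk and tests it in one step;
# the loop ends when read() returns the empty (falsy) bytes object.
while chunk := f.read(4):
    chunks.append(chunk)

assert b"".join(chunks) == b"abcdefghij"
```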

This honestly makes it seem more confusing to me. The fact that there is now an operator that can only be used in certain statements just makes things more confusing. And if there really is no overlap, then why wasn't the "=" operator just extended to also work in expressions? "while chunk = read()" seems like it makes just as much sense without adding the confusion of another operator.

One of the good things about not using the "=" operator is that you cannot accidentally turn a comparison into an assignment, a feature that is a common cause of errors in other languages that do support it. By adding a completely different character to the operator it is not very likely to cause bugs, compared to just forgetting to type that second =

Is it really that common? I made this typo a few times in my life. It was corrected every time before the program actually ran, because the compiler warned me about it. I don't see how you can make this mistake if you're not aggressively trying to (by turning off warnings, for example).

I guess it is not common, but by using the = operator you would not get the warning, and instead get unexpected behaviour.

I expect the PEP authors want to avoid the "while chunk == read()" class of bugs

ninjaedit: indeed https://www.python.org/dev/peps/pep-0572/#why-not-just-turn-...

And also the converse (but no less dangerous) "if value = true"

> The fact that there is now an operator that can only be used in certain statements just makes things more confusing

The new operator (like many Python operators) can only be used in expressions (statements can contain expressions, but expressions are not a subset of statements.)

> The fact that there is now an operator that can only be used in certain statements just makes things more confusing

Because the “=” operator is the thing that defines an assignment statement. Even if this could be resolved unambiguously for language parsers, so that “statement-defining” and “within-expression” uses didn't clash, it would create potential readability difficulties for human readers. Keeping them separate makes the meaning of complicated assignment statements and assignment-including expressions more immediately visually clear.

>it would create potential readability difficulties for human readers

I think the major argument (at least, the one I see most frequently) is that the walrus operator does create readability difficulties for humans, which is exactly why many people view it as non-pythonic. This is one of the few times I've seen someone argue that ":=" makes things more readable.

An argument against expression assignment is that it can create readability problems compared to separating the assignment from the expression in which the value is used. Even most supporters of the feature agree that this can be true in many cases and that it should be used judiciously.

This is in no way contrary to the argument that the walrus operator improves readability of expression assignments compared to using the same operator that defines assignment statements.

There are at least three ways to iterate over a list and create a new list as a result. That's not very pythonic, is it?
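For instance (the doubling here is just an arbitrary transformation):

```python
xs = [1, 2, 3]

out1 = []
for x in xs:                  # 1. explicit loop
    out1.append(x * 2)

out2 = [x * 2 for x in xs]    # 2. list comprehension

out3 = list(map(lambda x: x * 2, xs))  # 3. map()

assert out1 == out2 == out3 == [2, 4, 6]
```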

The Zen of Python states:

> There should be one-- and preferably only one --obvious way to do it.

There are plenty of ways to do assignments. Walrus assignments are only the obvious way in certain cases, and in general there aren't other obvious ways. For testing and assigning the result of re.match, for instance, walrus assignments are clearly better than a temporary.
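The re.match case, sketched (the pattern and input string are made up):

```python
import re

line = "error: code 42"

# Pre-3.8: a temporary that leaks into the surrounding scope anyway.
m = re.match(r"error: code (\d+)", line)
if m:
    code = int(m.group(1))

# 3.8+: test and bind in one expression.
if m := re.match(r"error: code (\d+)", line):
    code = int(m.group(1))

assert code == 42
```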

I can think of lots of nonobvious ways to do assignments, like setattr(__module__...)

I can think of more than two ways to do a lot of things in Python. Besides the ":=" doesn't work exactly the same.

Also, I can't bring myself to call it the walrus operator. Sorry, guys. I had a Pascal teacher ages ago who pronounced it "gets" and that has always stuck.

Assignment can be confusing already.

  >>> locals()['a'] = 1
  >>> a
If anything, the walrus operator allows for tightly-scoped assignment, which is good in my opinion.

You don't even need the locals() function to get into trouble:

    x = [1, 2, 3, 4]
    def foo():
        x[0] += 3  # Okay
    def bar():
        x += [3]   # UnboundLocalError
    def qux():
        x = [5, 6, 7, 8]  # Binds a new `x`.

    def bar():
        x += [3]   # UnboundLocalError
This is an especially funky one. x.extend([3]) would be allowed. Presumably x += [3] is not because it expands to x = x + [3]... However, the += operator on lists works the same as extend(), i.e. it changes the list in-place.

dis.dis(bar) shows:

              0 LOAD_FAST                0 (x)
              2 LOAD_CONST               1 (3)
              4 INPLACE_ADD
              6 STORE_FAST               0 (x)
              8 LOAD_CONST               0 (None)
             10 RETURN_VALUE
So INPLACE_ADD and STORE_FAST are essentially doing x = x.__iadd__([3])
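The in-place behaviour is easy to verify directly:

```python
a = [1, 2]
alias = a

a += [3]        # __iadd__: mutates the existing list object
assert alias == [1, 2, 3]    # the alias sees the change

a = a + [4]     # __add__: builds a new list and rebinds `a`
assert alias == [1, 2, 3]    # the alias does not
assert a == [1, 2, 3, 4]
```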

This isn't really true. There's one way to do assignment, `=`, and one way to do special assignment that also works as an expression, `:=`. You should always use `=` unless you *need* `:=`.

I don't think that philosophy was ever truly embraced to begin with. If you want evidence of that try reading the standard library (the older the better) and then try running the code through a linter.

The idea that str.format produced simpler or more readable code than f-strings is contrary to the experience of most Python users I know. Similarly, the contortions we have to go through in order to work around the lack of assignment expressions are anything but readable.
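A side-by-side of the two (a trivial example, but the difference compounds with more placeholders):

```python
name, count = "walrus", 3

# str.format: every placeholder is bound explicitly at the call site.
old = "Hello {name}, you have {count} items".format(name=name, count=count)

# f-string: the names are read directly from the enclosing scope.
new = f"Hello {name}, you have {count} items"

assert old == new == "Hello walrus, you have 3 items"
```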

I do agree that Python is moving further and further away from the only-one-way-to-do-it ethos, but on the other hand, Python has always emphasized practicality over principles.

This is what happens when you lose a BDFL. While things become more "democratic", you lose the vision and start trying to make everyone happy.

Walrus operator is the direct result of the BDFL pushing it over significant objection.

Well, there were 4 versions released since 3.3 that still had a BDFL, so I dunno if that's the issue, yet.

I'm someone who loves the new features even though I don't think they're "pythonic" in the classical meaning of the term. That makes me think that being pythonic at its most basic level is actually about making it easier to reason about your code... and on that count I have found most of the new features have really helped.

You can write very Python2.7 looking code with Python3. I don't think many syntax changes/deprecations have occurred (I know some have).

Yep, I did a 2to3 conversion recently and it got the whole project 95% of the way there. A 3to2 would be in theory almost as simple to do for most projects.

My first thought was the same as the snarky sibling comment, but after reading TFA I realized these are all features I've used in other languages and detest. The walrus operator and complex string formatting are both character-pinching, anti-maintainability features.
