
Problems I Have with Python - ploggingdev
http://darkf.github.io/posts/problems-i-have-with-python.html
======
koliber
Some of the author's points are valid. However, many are subjective
preferences, some are gripes without solutions, and others make it difficult
to understand the author's underlying philosophy.

My main critique is that the author added this statement that puts a negative,
entitled, and naive tone on the whole article:

>>> to which no real improvements are being made for some reason.
(Incompetence? Politics? Both? Who knows.)

The author does not acknowledge that some of his points went unaddressed for
reasons other than incompetence and/or politics. I imagine this statement
could offend some of the smart and hard-working people who are working on
improving the Python language.

Reasons the author does not acknowledge:

\- the community does not agree with the author's subjective idea of what
Python should look like

\- the solutions to a problem (I'm thinking GIL) come with a lot of
consequences which are not readily acceptable

\- solving some of the issues would exacerbate backwards-compatibility
problems. This would increase the author's problems even more, because, as he
states, "This is particularly a pain for libraries where I expect to pip
install them and have them "Just Work"."

~~~
darkf
>However, many are subjective preferences

Certainly, it is titled "Problems _I_ Have" for a reason. :-) I do not expect
everyone to agree with me; these are just the things I personally feel are
lacking after using it quite a lot.

> I imagine this statement could offend some of the smart and hard-working
> people who are working on improving the python language.

That was certainly not my intention -- as stated, I do love the language and
appreciate all work going into it. I do not intend to undermine their efforts,
just point out some of my perceived design flaws.

>\- the community does not agree with the author's subjective idea of what
Python should look like

I think we all agree there should be a good solution to concurrency (and
"stackless" variants which power eventlet, etc. have been used for ages; as
has Twisted, of which asyncio is not a sufficient clone.), parallelism, etc.

The standard library in general encourages use of higher-order functions and
concepts borrowed primarily from FPLs (see: comprehensions, map/reduce, sort,
etc.) I could not imagine seeing them backtracking on this -- it only helps
them to go further in that direction.

>\- the solutions to a problem (I'm thinking GIL) come with a lot of
consequences which are not readily acceptable

I did not propose a solution because there are many, as you note; there are,
however, implementations with decent solutions, such as (AFAIK) Jython.

>\- solving some of the issues would exacerbate backwards-compatibility problems.

Such as what?

~~~
jsmeaton

> with decent solutions like AFAIK Jython.

> > \- solving some of the issues would exacerbate backwards-compatibility problems.

> Such as what?


Jython can't load native C extensions, which need to be GIL-aware. Most
programs, and the Python interpreter itself, aren't thread safe, so suddenly
removing the GIL would break a lot of programs.

I agree with you that the concurrency story for python sucks, but claiming
solutions could exist without breaking back-compat is just not right.

~~~
Chris2048
Is Jython good though?

Last I used it, it had issues keeping pace: a demonstrable memory issue took a
long time to fix, and it lagged considerably behind Python 2/3 versions.

It's also worth noting that, as an essentially transcompiled language, it
requires a good appreciation of the Java machinery, in which case languages
like Groovy provide good competition.

~~~
vorg
When talking about JVM languages suitable for building systems, Jython isn't
generally mentioned for the reasons you give, nor is Apache Groovy. Those two
are good for scripting, e.g. testing Java classes, build scripts, glue code.
Besides Java, languages like Clojure, Scala, and Kotlin are usually considered
as systems languages on the JVM.

~~~
Chris2048
True, but I believe Groovy and Jython compete for the same space, even if that
space isn't system-building.

------
todd8
A language as old, as large, and as flexible as Python ends up with a few
wrinkles, but designing a successful language isn't easy, and I really admire
the work done by everyone behind Python, especially the vision and invention
of Guido van Rossum.

A kind of post-experience review of a language's strengths and weaknesses is a
good exercise. For comments and complaints that really were influential in the
history of programming languages see:

Knuth, _The remaining trouble spots in Algol 60_ , Communications of the ACM,
10, 10, 1967, pp. 611--617.
[https://www.cs.virginia.edu/~asb/teaching/cs415-fall05/docs/...](https://www.cs.virginia.edu/~asb/teaching/cs415-fall05/docs/algol-3.pdf)

J. Welsh, W. J. Sneeringer, C. A. R. Hoare, _Ambiguities and insecurities in
Pascal_ , Software: Practice and Experience, 7, 6, November 1977, pp. 685--696.
[http://onlinelibrary.wiley.com/doi/10.1002/spe.4380070604/ab...](http://onlinelibrary.wiley.com/doi/10.1002/spe.4380070604/abstract)

Brian W. Kernighan, _Why Pascal is not my favorite programming language_ ,
April 2, 1981, AT&T Bell Laboratories.
[http://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-
pasc...](http://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pascal.html)

------
y7
Very short-sighted. It sounds like the author just wants a language with a
different philosophy, and instead of realizing this, goes on to call the
differences "obvious flaws in design" that aren't improved because of
"Incompetence? Politics? Who knows." This is especially bad given that Python
(in my opinion) has a very well thought-out and transparent change process,
with PEPs that usually consider most alternative solutions to a problem and
are held to a high standard in order for the BDFL to approve them.

Especially regarding some of the points near the end, it seems that the author
just doesn't understand Python. Python doesn't have or want a static type
system, and it mostly wants an imperative style that keeps lines short.

> Well... what's worse, having a slightly goofy looking inline "def", or
> having a gimped language?

Why is it such a problem to move your closure to its own line and give it a
name?

~~~
Grue3
>Why is it such a problem to move your closure to its own line and give it a
name?

What if it cannot have a meaningful name? You don't give a name to every
single value in your program, so why would first-class functions be any
different? For example, there's an idiom in Python where you write a function
named "wrapper" in a decorator and then return wrapper. Except you put @wraps
on it, so the name of this function isn't even "wrapper" anymore. So what was
the point of naming it "wrapper" when you could just do ```return wraps(lambda
...)``` if lambda weren't so underpowered?
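For readers who haven't seen the idiom, here is a minimal sketch of the
"wrapper" pattern with @wraps (the `logged` decorator is illustrative, not
from the article):

```python
from functools import wraps

def logged(func):
    @wraps(func)  # copies func's __name__, __doc__, etc. onto wrapper
    def wrapper(*args, **kwargs):
        print("calling", func.__name__)
        return func(*args, **kwargs)
    return wrapper

@logged
def add(a, b):
    return a + b

# As noted above, the decorated function is no longer named "wrapper":
print(add.__name__)  # add
```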

~~~
y7
The name of the function when you _read_ the code is still wrapper. The point
of naming it is that it allows you to refer back to it, as well as provide the
reader with a clue to what it does. This way, Python attempts to reduce
nesting, which hurts ease of reading. You see this a lot in Javascript, with
all the anonymous functions being passed around. Besides making your stack
trace hard to parse, it also makes it hard to follow whether you're currently
reading a named function, or an anonymous function that's being passed as a
parameter inside another anonymous function, etc.

------
lmm
Python's semantics are unlikely to ever be fast and most Python users have
already worked around its speed issues.

asyncio is new; python3 porting is happening. It seems unfair to complain that
no real improvements are being made and also complain that these new things
are immature.

reduce being pushed behind an import is stupid, but it's only one import.

lambda is fine, if you're writing in functional style you're using expressions
for everything anyway. And you will probably be more persuasive if you can
make your points less offensively.

For "bag of data" classes look at attrs.

(All that said, having found Scala I don't miss Python at all. (Except when
writing a desktop GUI - PyQt was really nice))

~~~
darkf
>lambda is fine, if you're writing in functional style you're using
expressions for everything anyway.

No, I /really would/ like to be able to write:

foo.on_click(lambda: x += 1)

The language not supporting this (when most others do) is just silly.
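For context: this is a SyntaxError in CPython because assignment is a
statement and a lambda body must be a single expression. The usual workaround
is a named closure with `nonlocal` (a sketch; `make_counter` is a hypothetical
name):

```python
def make_counter():
    x = 0
    def on_click():
        nonlocal x  # a statement, hence not allowed inside a lambda
        x += 1
        return x
    return on_click

click = make_counter()
click()
click()
print(click())  # 3
```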

~~~
amyjess
> foo.on_click(lambda: x += 1)

Mutation in lambdas is not compatible with your complaint about "inadequate
support for high-level functional programming", as it goes against the
principles of FP.

You can't do that in Haskell either, and any FP purist would blanch at a
statement like that.

~~~
darkf
>You can't do that in Haskell either, and any FP purist would blanch at a
statement like that.

Obviously you've never met State and/or lens then.

Yes, you can do it -- and no, I never said it should follow _pure_ FP
principles.

~~~
lmm
But if you write a State equivalent of "x += 1" in Python then you can use it
in a lambda too.
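One way to read this: represent "x += 1" as a value, a function from old state
to new state, which a lambda can apply without needing statements (a sketch;
the names are illustrative):

```python
# A State-style action is just a function: old_state -> new_state
incr = lambda x: x + 1

# The lambda can't contain the statement "x += 1", but it can apply
# the action; the caller threads the state through explicitly.
on_click = lambda state: incr(state)

x = 0
x = on_click(x)
x = on_click(x)
print(x)  # 2
```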

------
jknoepfler
I don't understand the desire to turn python into a high performance language.
It's 2017, if you want performance just write some go/cpp/rust. If you want to
leverage an old and very mature concurrency framework, use elixir/erlang. If
you need a giant data integration framework, use java. If you want a stellar
bash replacement, use python. Know your tools, don't bloat them with
unnecessary crap. (The list was not intended to be exhaustive or pick the
best, just to illustrate that we have tools for each type of job).

The idea of single-language buy-in has always perplexed me.

~~~
Pxtl
Because other high-performance languages have been improving their readability
and expressiveness.

Personally, my tool of choice is C# right now. I find it quite readable, and
every bit as expressive as Python - even more so - plus it has the performance
advantages of being designed from square one as a compiled language instead of
an interpreted one.

It has async/await, it has functional features that Guido hates, it has
performance, and it's quite legible ever since C#3 included type inference and
you can just write "var" all over the place. It has some warts, but the warts
are worth it.

Imho, python has stagnated. It's still a useful, wonderful language and I
enjoy working in it when I have to, but I never find myself _choosing_ python
for new projects, and I don't see that ever changing.

~~~
ehsankia
But then, if you want to write a script to convert a bunch of files from one
format to another, will you start a whole C# project?

If you want to quickly do a mathematical calculation with matrices, then plot
and visualize it, will you use C#?

"Tool of choice" is misleading. Maybe you are a webdev. Maybe you write GUI
applications for a living. I don't think there's clear "Tool of choice" for
all jobs.

The job dictates the tool of choice.

~~~
Pxtl
For one-offs I use CShell so I can have a REPL (good way to make unit tests,
too, imho)

[http://cshell.net](http://cshell.net)

For reusable scripts I make a command-line app. I admit the project structure
is a bit heavyweight for that, but I have Visual Studio open all the time
anyway.

------
asrp
A lot of the author's complaints, especially near the end, are personal
preferences, which explains why these "improvements" were never added. Not
everyone would prefer a move to a significantly more functional style.

Arguably this difference is what caused Coconut to be made in the first place.

There's too much discussion on these topics to simply brush existing decisions
off as "Incompetence? Politics? Both? Who knows."

For the fifth point, `list.index` works fine in Python 2 and 3 for me.

~~~
darkf
>Arguably this difference is what caused Coconut to be made in the first
place.

Sure, that and it's far easier to write a new language and transpile than it
is to fork and modify existing implementations.

>There's ample discussion on these topics to simply brush existing decisions
off as "Incompetence? Politics? Both? Who knows."

If you would like to link to such discussions I would not hesitate to add them
as footnotes/amendments.

>For the fifth point, `list.index` works fine in Python 2 and 3 for me.

Sorry, mixed up `index` and `find`. `[].find` is not a function, but
`[].index` is. Thanks for catching that, I keep confusing them (another minor
annoyance of having both).
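The mix-up is understandable: `str` has both methods, `list` has only one, and
the two report a miss differently. A quick illustration:

```python
# str has both .find and .index; they differ on a miss
print("hello".find("z"))   # -1 (miss reported as a sentinel)
print("hello".index("e"))  # 1

# list has only .index, which raises on a miss
print([1, 2, 3].index(2))  # 1
try:
    [1, 2, 3].index(9)
except ValueError:
    print("no .find fallback for lists")
```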

~~~
asrp
Oops, I didn't expect this to blow up, thanks for responding.

Here's a discussion about switch case (I was also looking for one the other
day, but in my case, it was purely for optimization).

[https://www.python.org/dev/peps/pep-3103/](https://www.python.org/dev/peps/pep-3103/)

And the wiki, for example, talks about the GIL

[https://wiki.python.org/moin/GlobalInterpreterLock](https://wiki.python.org/moin/GlobalInterpreterLock)

I remember reading articles from here about that a few times, but can't find
them now. If you are still interested, I can try to dig it up.

------
ankitml
It is hard to accept input from people who don't appreciate that any technical
decision requires understanding the tradeoffs. It is true that Python is not
perfect, but perfection was never a goal. Python has made some tradeoffs, just
like every other technical system. Also, just like any other technical system,
people are working towards improving it in a specific direction. The only way
to change or evolve that direction is to be part of the community,
understand it, and then influence it. Abusing the community and calling it
silly is plain stupid. The author's goal is clearly not to influence a change
but to self-aggrandize and prove himself right.

~~~
kazinator
Influencing the Pythons of this world is generally a waste of time; just go
use that which, today, works the way you want, or make it yourself if it
doesn't exist.

Anyway, are you saying that knowledgeable people who think Python is junk
shouldn't say anything? Only get involved in Python development or shut up?

If someone's words can save just one person from using Python, that's
worthwhile.

~~~
darkf
Nah, I would love people to use Python -- just to help improve it as well
(either through libraries, alternative languages or submitting changes through
official channels.)

Saying I am "self aggrandising" is disingenuous and misleading at best.

------
cybersol
It's definitely not perfect, though ultimately most trade-offs in Python come
down to readability. I once went down the rabbit hole (3 library iterations)
of overloading operators to enable a very shell and pipe oriented syntax, only
to later realize how much harder my 6-month old code was to read even for me.
So I've come to appreciate Guido's experience for the trade-off between
expressive power and readability.

For instance, among the things you suggest: reduce used in its most
straightforward manner is readable, but it can also enable some of the
nastiest, most head-scratching one-liners. Lambda is useful for small things
like callbacks, but readability should lead you to a real function with a
descriptive name sooner rather than later.
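A small illustration of that trade-off: the same sum of squares written with
`reduce` and as a plain loop (both are correct; which reads better is the
question):

```python
from functools import reduce

nums = [1, 2, 3, 4]

# compact, but the accumulator logic is implicit in the lambda
total = reduce(lambda acc, n: acc + n * n, nums, 0)

# longer, but every step is explicit
total2 = 0
for n in nums:
    total2 += n * n

print(total, total2)  # 30 30
```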

~~~
collyw
Agreed. I came from Perl to Python.

Perl does feel more powerful and expressive, but it often gives you enough
rope to hang yourself whereas Python doesn't.

------
thomasvarney723
Though I likely haven't completely understood each of the author's gripes,
each problem seems to me to have a notable solution in Clojure (with the
exception of tail-call optimization).

Clojure:

is compiled to JVM byte-code and is fast.

has a good parallelism story (parallel map, parallel fold, channels)

is almost completely backwards compatible.

has a sequence abstraction that leverages the same operations over many
different types (string, vector, list, set, map, etc.).

has a standard compose function.

has reduce, map and filter in the standard library. Transducers (also first
class) further extend their usefulness.

's closures allow statements, and there's even sugar for anonymous functions.

has macros to reduce boilerplate.

has a conditional macro.

I'm sure this list isn't unique to Clojure but I'm most familiar with it.

~~~
jgalt212
All of the above is good except for transducers. They are a necessary hack in
Clojure: because the data is immutable, running a large number of functions
across changing data is rife with overhead. So the hack is to mutate the code
many times, so you only have to mutate the data once.

~~~
thomasvarney723
I see what you mean and although the definition of a from-scratch transducer
looks a bit ugly to me, the idea of composing existing transducers together
seems rather elegant.

------
willvarfar
I find myself nodding to everything on the list.

I use python all the time. The lack of first-class anonymous functions is
plain irritating.

I would add to the list that nonlocal is horribly limited, that I want better
do-while loops, and that I want to exit nested loops more cleanly, and so on.
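For reference, the standard emulations of the two missing constructs (a
sketch; `find_pair` is an illustrative name):

```python
# do-while: run the body at least once, test the condition at the end
n = 17
digits = []
while True:
    digits.append(n % 10)
    n //= 10
    if n == 0:
        break  # plays the role of the do-while condition

# exiting nested loops cleanly: wrap them in a function and return
def find_pair(matrix, target):
    for i, row in enumerate(matrix):
        for j, value in enumerate(row):
            if value == target:
                return i, j  # exits both loops at once
    return None

print(digits)                          # [7, 1]
print(find_pair([[1, 2], [3, 4]], 4))  # (1, 1)
```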

------
Grue3
I like this list a lot. I also find myself raging at awful lambda and the lack
of switch. No, it wouldn't make the language any less "pythonic" to make them
useful.

~~~
fleetfox
IMHO switch is a horrible construct; I'd rather see ML-style pattern matching.

~~~
OskarS
Switch makes sense in really low level languages like C where it becomes a
branch table (and allows things like Duff's device), but it has no place in
higher-level languages. Fallthrough, while very occasionally useful, is bug
prone and weird (breaks the "principle of least surprise" in a major way).
Python made the right call not including it.

~~~
khedoros1
So do it like "match" in Rust. Fallthrough certainly seems un-Pythonic; it
seems worthwhile to ditch it in favor of pattern matching.

It _also_ seems un-Pythonic to have to implement something in a less-obvious
way. Choosing an action based on one item from a set of possible inputs means
either a dict that maps to lambdas or function variables, or a big chain of
if-else. Neither of those options is optimal.
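A minimal sketch of the dict-dispatch workaround being described (all names
here are illustrative):

```python
def on_start():
    return "starting"

def on_stop():
    return "stopping"

# dispatch table standing in for a switch/match statement
handlers = {
    "start": on_start,
    "stop": on_stop,
}

def handle(command):
    # .get with a default plays the role of a `default:` branch
    return handlers.get(command, lambda: "unknown")()

print(handle("start"))   # starting
print(handle("reboot"))  # unknown
```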

------
highfestiva
I'd love to have all those problems fixed. I constantly bump into exactly
those things myself, and I believe a majority of developers would find the
language better with the remedies than without. Especially the trivial things
(like flatten, moving reduce back out of functools, lambda state and so
forth) would take no time - it's just politics.
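For what it's worth, the usual stand-in for the missing flatten is
`itertools.chain.from_iterable` (one level deep only):

```python
from itertools import chain

nested = [[1, 2], [3], [4, 5]]
flat = list(chain.from_iterable(nested))  # flattens exactly one level
print(flat)  # [1, 2, 3, 4, 5]
```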

------
Loic
The one problem I have with Python, and would like to solve, is to be able,
from within a request-rendering function/method of my web application (think
Flask), to run something like:

    
    
        handle1 = call_webservice_1(args)
        handle2 = call_webservice_2(args)
        handle3 = call_webservice_3(args)
        (realres1, realres2, realres3) = wait_until_timeout(500, handle1, handle2, handle3) 
        # here, I have my results in realresX or None if timeout
        # the call_webservice_X would be non blocking
    

This way I can dispatch my requests to my backends and degrade gracefully if a
request doesn't complete in time. I was doing that in PHP using ZeroMQ to send
the requests and listening for the answers with a unique id on each request,
but now I would prefer to stay with an HTTP-based protocol to communicate with
my backends.

~~~
orf

        from asyncio import gather, wait_for, get_event_loop, TimeoutError
        from aiohttp import web
    
        async def my_handler(request):
            handle1 = call_webservice_1(args)
            handle2 = call_webservice_2(args)
            handle3 = call_webservice_3(args)
    
            # wait at most 500 ms for all three; give up on timeout
            try:
                await wait_for(gather(handle1, handle2, handle3), timeout=0.5)
            except TimeoutError:
                pass
    
            return web.json_response({'finished': True})
    
        app = web.Application()
        app.router.add_route('GET', '/test/', my_handler)
    
        loop = get_event_loop()
        server = loop.create_server(app.make_handler(), 'localhost', 8080)
        loop.run_until_complete(server)
        loop.run_forever()
    

Done. Not Flask, but if you need to make lots of parallel network calls during
a web request, why the hell are you using Flask?

~~~
spookylukey
That's really helpful. Can you make an example which doesn't require making
the my_handler function 'async'? For cases when you don't want to make an
entire async stack, you just want to slot some async code into your existing
code.

If the answer is "to get some async goodness, just use this easy code, plus
rewrite your entire project to use a different framework and set of
libraries", then we are only fooling ourselves.

~~~
AnkhMorporkian
You need to use the asyncio (or equivalent) event loop if you want to use the
asyncio module. loop.run_until_complete() is synchronous though, so you would
simply call that and it will block the control flow despite that function
being async. You can definitely mix it with legacy code.

I would recommend against it, but if you had an existing framework, you could
just make the endpoints lambdas that are something like:

    
    
        app.route("/whatever", lambda: loop.run_until_complete(async_handler_function()))

~~~
orf
^ this, but be aware that it's only worth it if you do more than one external
call in parallel. I.e. this is pointless:

    
    
        res = await get('https://somesite.com')
        return Response(res['data'])
    

As you'd get the same thing if you just did it synchronously (without the
await). But if you want to fetch 2 or more pages in parallel, that's when it
really pays off.
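The pay-off can be seen even without a network, using `asyncio.sleep` as a
stand-in for I/O latency (a sketch; `fake_fetch` is hypothetical):

```python
import asyncio
import time

async def fake_fetch(url):
    await asyncio.sleep(0.1)  # stands in for network latency
    return "body of " + url

async def main():
    start = time.monotonic()
    # the three "requests" overlap, so the total is ~0.1s, not ~0.3s
    bodies = await asyncio.gather(
        fake_fetch("a"), fake_fetch("b"), fake_fetch("c"))
    print(len(bodies), time.monotonic() - start < 0.25)

loop = asyncio.new_event_loop()
loop.run_until_complete(main())
loop.close()
```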

~~~
AnkhMorporkian
Yeah, absolutely. I am not a fan of mixing synchronous and asynchronous code,
but the design of asyncio makes it very easy to do. I think most people
struggling with the concept don't realize that asyncio is inherently blocking
when it's being used (well, with the caveat of run_in_executor, but that's
best left ignored for the purposes here).

------
gamesbrainiac
I think there are many people out there who use Python because it's practical
and useful to them, but they don't love it, and that is completely fine.
Anything good is a compromise between different groups, and Python is no
exception to that rule. I think Python does a decent job of appeasing both
functional zealots and object-oriented fanatics.

------
ascotan
>Parallelism is very bad on CPython and PyPy;

The GIL was added because the early Python libraries were not written to be
threadsafe. This was a terrible oversight and frankly should have been
corrected at some point. The underlying implementations use pthreads. It's
actually worse, because as multi-core devices came out, the "lock-thrashing"
behavior of the GIL got worse. Rather than fixing the problem we have
'multiprocessing'. I still don't understand why this hasn't been tackled.
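For reference, the multiprocessing escape hatch mentioned above, sketched
minimally: each worker is a separate process with its own interpreter and its
own GIL, so pure-Python CPU work actually runs in parallel.

```python
from multiprocessing import Pool

def cpu_bound(n):
    # threads would serialize on the GIL here; processes don't
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(2) as pool:
        print(pool.map(cpu_bound, [10, 100, 1000]))
        # [285, 328350, 332833500]
```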

> Quite a few legacy projects are written in Python 2, and it can take some
> work to port them. This is particularly a pain for libraries where I expect
> to pip install them and have them "Just Work".

Python 3 was DOA. I don't want to be overly critical here, but there was no
compelling reason to switch, because Python 3 didn't have anything
fundamentally more interesting than Python 2. It didn't really fix any of the
serious language issues (like the GIL). It was almost like the Python version
of Windows Vista or IPv6. People want you to switch, but meh.

>The BDFL himself, Guido van Rossum, has infamously declared that he does not
like functional programming

And now you've come to the heart of the matter. There have been some amazing
tweaks of Python (Stackless, PyPy, Twisted, greenlets), some of which people
have attempted to merge into greater Python. Most were rejected. At some point
people give up and walk away. For better or worse, Python is Guido's language.
Take it or leave it. No switch statement for you, buddy.

I think Python is a fantastic language. It has become ubiquitous. For all its
warts, its lack of change has probably helped its adoption. Literally EVERYONE
writes Python code. Network guys, sysadmins, even your manager (or your
manager's manager) probably has some Python code stashed away somewhere.

However, I don't feel that Python is doing enough to catch up. Print as a
function does nothing for me. There are so many, many, many warts (which I
won't get into), and yet Python seems to be polishing the chrome rather than
fundamentally fixing its core problems.

I want to use and love Python, but it's become "the devil you know", so to
speak. I've lost hope that Python will adapt to the future and have put my
bets elsewhere.

~~~
astamatto
Where are your bets now? @.@

------
Siecje
The biggest problems with Python are packaging and distribution.

It would be nice to create a single file and be able to send it to someone,
like Go, which even has cross-compilation.

Mobile support is another: you can't easily write a mobile application in Python.

~~~
darkf
>Mobile support, you can't easily write a mobile application in Python.

Depends on your needs, but there is at least Kivy.

------
metaphorm
My perspective on this is that it's a remarkably short list of complaints for
a programming language, all things considered.

Python is certainly not perfect, but a similar list of pet peeves and
grievances for, say, PHP or Java would easily be 10 times longer even under
the most charitable interpretations.

------
tacostakohashi
The problem I have with Python is that for loops don't have their own scope;
only functions do.

Add that to the lack of variable declarations (even optional ones, à la my in
Perl or var in JavaScript), and it gets hard to work out what the scope of any
given variable actually is.

Surprising example:

    
    
      fns = []
    
      for n in [1,2,3,4]:
        def fn():
          print(n)
        fns.append(fn)
    
      for fn in fns:
        fn()

~~~
Shizka
But doesn't this just happen because n is a pointer? What would you expect it
to print? 1,2,3,4?

~~~
ufo
In Lua it prints 1,2,3,4. It has to do with each loop iteration behaving as if
it declared a different variable instead of sharing the same variable across
the loop.

Anyway, the problem they were talking about is clearer when you are closing
over stuff other than the loop variable:

    
    
        fns = []
        for n in [1,2,3,4]:
            x = n*10
            def fn():
                print(x)
            fns.append(fn)

~~~
bb88
Yeah... so I agree that's confusing. It's akin to:

    
    
        def x(y=[]):
            y.append(1)
            return y
    
        for z in range(4):
            print(x())
    
    

For your case, I recommend using partial functions, which were created for
this type of issue. I think it's also cleaner than closures where x depends on
an outside context.
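Applied to the loop from the parent comments, the `partial` fix looks like
this (a sketch; `partial` freezes the current value of `n` as an argument
instead of closing over the shared loop variable):

```python
from functools import partial

def show(value):
    print(value)

fns = []
for n in [1, 2, 3, 4]:
    # bind the *current* n by value; no late-binding surprise
    fns.append(partial(show, n))

for fn in fns:
    fn()  # prints 1, 2, 3, 4 rather than 4, 4, 4, 4
```

The same effect is often achieved with a default argument: `def fn(n=n): print(n)`.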

------
grondilu
> Quite to the point, lambdas (anonymous closures) in Python are gimped. They
> are single-expression functions, which means no statements, even
> global/nonlocal qualifiers.

I remember once on Rosetta Code I wanted to write a Runge-Kutta function in
Python with a lambda. I was stopped by the lack of variable assignment, until
I remembered that it can be emulated by nesting function calls:

    
    
        def RK4(f):
            return lambda t, y, dt: (
                lambda dy1: (
                lambda dy2: (
                lambda dy3: (
                lambda dy4: (dy1 + 2*dy2 + 2*dy3 + dy4)/6
                )( dt * f( t + dt  , y + dy3   ) )
                )( dt * f( t + dt/2, y + dy2/2 ) )
                )( dt * f( t + dt/2, y + dy1/2 ) )
                )( dt * f( t       , y         ) )
    

[https://rosettacode.org/wiki/Runge-
Kutta_method#using_lambda](https://rosettacode.org/wiki/Runge-
Kutta_method#using_lambda)

~~~
darkf
... Yeah, that's gnarly. :D That is emulating `let` using lambdas, though, and
not mutable assignment. Still useful if you really want to nest them, but
still immutable.

~~~
grondilu
I don't know - isn't it possible to do the equivalent of mutable assignment if
I use the same variable name several times?

For instance, for the equivalent of x = 3; x = x + 1; print(x):

    
    
        (lambda x: (lambda x: print(x))(x+1))(3);

------
OJFord
Personally I'd love to see pattern matching / de-structuring of dicts and
strings:

    
    
        foobar = "foo{}".format("bar")
        foo = "{}bar".unformat(foobar)
    
        def dict_returner():
            return {'foo': 1, 'bar': 2, 'foobar': 3}
    
        {'foo': newvar1, 'bar': foo} = dict_returner()
    

I find it strange that tuple unpacking exists, but nothing equivalent for
dicts - I can't see that it would be horribly inefficient? Especially
considering that one would be unlikely to use it with more than a few keys.

An extension to that providing a set of keys would be nice, too:

    
    
        foo_and_bar = dict_returner(){'foo', 'bar'}
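For comparison, the closest existing idioms today are `operator.itemgetter`
and a comprehension over a key subset (a sketch reusing `dict_returner` from
above):

```python
from operator import itemgetter

def dict_returner():
    return {'foo': 1, 'bar': 2, 'foobar': 3}

# closest current equivalent of the proposed dict destructuring
newvar1, foo = itemgetter('foo', 'bar')(dict_returner())

# closest current equivalent of the proposed key-subset selection
d = dict_returner()
foo_and_bar = {k: d[k] for k in ('foo', 'bar')}

print(newvar1, foo, foo_and_bar)  # 1 2 {'foo': 1, 'bar': 2}
```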

------
sametmax
> The standard interpreter being rather slow;

I'm tired of this one.

In the last 13 years, 97% of the projects I worked on didn't need Python to be
any faster; it was not the bottleneck. The remaining ones could leverage some
solution to bypass the problem. Python speed is indeed an issue for a few
people, but it's not the red flag I read about here and there.

I've been hearing this argument forever. PHP is slow. Java is slow. The first
one powered the Web for 10 years; the second one is the most used language in
the world. A lot of the time this argument is like hearing "I want a pony".

Actually, the rare people I met who really needed speed never complained. They
are usually hardcore professionals, and are already working on solutions.

Let's now talk about solutions.

Python is an interpreted and very dynamic language. It's tough to speed up. If
you look at the C code, you'll see the Python VM is quite well optimized
already.

Now the author says:

> no real improvements are being made for some reason

But there have been:

\- Psyco

\- Unladen Swallow

\- Stackless

\- NumPy and a lot of compiled extensions

\- PyPy

\- Pyston

\- Pyjion

\- Nuitka

\- Cython

\- Numba

People ARE actively working on the problem. It's a HARD problem, which is why
we don't yet have a definitive solution. And a lot of the people working on it
are not paid for this.

Yeah, JS became faster. You know how? Google spent millions and hired a bunch
of geniuses just to do it.

In 2011, the Python Software Foundation had $750,000 to spend for the whole
operation, including maintaining PyPI, the documentation, the official
website, the conferences and the various grants they provide. Even the few
devs who are paid to work on Python (e.g. Guido) can only do it part time.

So the author worked with Python for 10 years. He made a living out of
exceptional free software and complains about a problem he may not even have,
while people are working their asses off to solve it. And he writes an
aggressive rant about it.

> Parallelism is very bad on CPython and PyPy

Yes, again, this is a HARD problem. Python is very old - older than Java. We
only got multi-core recently. We can't destroy mono-core performance to get
multi-core, and we have to maintain a legacy code base.

We also have:

\- a good multiprocessing story;

\- 2 good async stories;

\- tooling to pre-spawn, manage and scale processes;

\- tooling to create task queues.

So the community is trying, for free, to solve the problem. We have solutions.
It's not perfect. But again, what's the point of complaining like a hungry
child at 4 o'clock, unhappy that it's not yet dinner time?

> asyncio does not seem very well integrated, and does not seem as useful as
> libraries like eventlet. They seem to have wanted to reinvent Twisted, but
> did so half-assed and did not include useful protocols (Twisted has line-
> based protocols, HTTP, etc. built in and easily subclassable.)

What is he talking about? We only just got it. How do you expect it to be well
integrated yet?

And Twisted is a framework (a very hard-to-use one) while asyncio is a
low-level lib.

eventlet doesn't let you choose where to switch context; it's basically like
threads. We already have threads.
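By contrast, asyncio does let you choose the switch points: control only changes hands at an explicit await. A minimal sketch:

```python
import asyncio

async def fetch(name, delay):
    # Control is only yielded at this explicit await -- nothing
    # switches mid-statement, unlike implicit green threads.
    await asyncio.sleep(delay)
    return name

async def main():
    # Run both coroutines concurrently on a single thread.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.02))

print(asyncio.run(main()))
```

gather preserves argument order in its result, so the scheduling is both concurrent and predictable.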

> Quite a few legacy projects are written in Python 2, and it can take some
> work to port them. This is particularly a pain for libraries where I expect
> to pip install them and have them "Just Work".

When was the last time that happened to anybody?

Seriously:

[http://py3readiness.org/](http://py3readiness.org/)

I've been coding in Python 3 for the last 2 years. It happened twice. Both
times I was able to convert the code base in a few minutes. I said minutes. Not
hours.

> There is an official tool 2to3 which does not work in all cases.

And the brakes in your car don't work in all cases either. They're still nice
brakes.

Plus you've got six and python-future. Converting any pure-Python code base is
not hard. Compiled extensions are harder, but my guess is the author never
needed to write one.
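For pure-Python code, the gentlest porting path is often just the __future__ imports, which make a 2.x file behave like 3.x; a tiny sketch:

```python
# These imports make a Python 2 file behave like Python 3 for printing,
# division and string literals -- often most of a small port right there.
from __future__ import print_function, division, unicode_literals

def average(items):
    # True division on both 2 and 3 (no silent integer division).
    return sum(items) / len(items)

print(average([1, 2, 4]))
```

Tools like six and python-future build on the same idea for the cases __future__ alone doesn't cover.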

And i'll say it again...

People have had 15 bloody years to migrate. It's not like JS tools breaking
every 3 months. It's not like PHP skipping version 6, or Perl taking 10 years
to get to version 6.

No. Python 3 arrived quickly after many warnings. Then tools, tutorials and a
looooooooooooot of time have been provided.

This is nowhere Python's fault. It's the best damn migration story I've ever
witnessed in my life.

My only grudge on Python 3 is that it didn't break ENOUGH. I wished for stuff
to have changed more.

> The standard library is sometimes inconsistent

One of my pet peeves as well.

> The BDFL himself, Guido van Rossum, has infamously declared that he does not
> like functional programming (odd, considering the language is built around
> FP concepts), and that map/reduce/filter should not be in the language. Well
> -- in my opinion that is a grave mistake, but more importantly the language
> suffers.

The author doesn't like the style of the language. So it's a matter of taste.

Well I like it that way.

Now what ?

> reduce is now tucked away inside the functools module (as of Python 3), even
> though it is the only one of map/filter that is not replaceable by
> list/set/dict comprehensions! Yet map and filter are still in the base
> global environment. What sense does that make?

Yes, it does, because reduce is seldom used. Grep GitHub and you'll see. map
and filter are still in the built-ins because people like the author complained
a lot on the mailing list.

Yet the majority of code bases I read, including most of the libs I use (I
spend a lot of time reading the contents of my site-packages), don't use
map/filter now that we have comprehensions.
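For reference, here are the comprehension forms that displaced map/filter in most codebases, and the reduce that has no such equivalent:

```python
from functools import reduce

nums = [1, 2, 3, 4, 5]

# The map/filter spellings...
doubled = list(map(lambda x: x * 2, nums))
evens = list(filter(lambda x: x % 2 == 0, nums))

# ...and the comprehensions most Python code now prefers.
assert doubled == [x * 2 for x in nums]
assert evens == [x for x in nums if x % 2 == 0]

# reduce has no comprehension equivalent, hence its home in functools.
total = reduce(lambda acc, x: acc + x, nums, 0)
print(total)
```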

> I often find myself reimplementing flatten as flatten = lambda xs:
> itertools.chain.from_iterable(*xs)

Use comprehensions to flatten. Learn your language, for van Rossum's sake!

    (y for x in xs for y in x)

> The lack of tail call optimization in most implementations makes writing
> tail recursive algorithms rather pointless, unfortunately, even when they
> may be more legible than their iterative counterparts.

> There is no standard way (even in functools) to compose functions. There is
> partial application via functools.partial, at least...

Again, it's because recursion is not encouraged in Python. It's the philosophy
of the language. One can dislike it, but it's not a Python problem, it's a
Python decision.

I stay with Python precisely for this. Every time I go read functional-heavy
code, it's hard to read. I'm an expert coder and trainer; I'm paid up to
900€/day. Most code should be easy to understand given my experience. When
it's not, I consider that a bug.

Functional lovers write smart code. I hate reading smart code. I want code
that is easy to debug.

If you really need TCO, like when implementing a state machine, there are
solutions:

[http://neopythonic.blogspot.fr/2009/04/final-words-on-
tail-c...](http://neopythonic.blogspot.fr/2009/04/final-words-on-tail-
calls.html)

Not as elegant, but good enough, since actually needing it is a rare
occurrence. Again. Rare.

The language is optimized for regular use cases and readability, not smart
formulas.
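The workaround in the linked post amounts to a trampoline: a tail call returns a thunk instead of recursing, and a driver loop runs the thunks. A minimal sketch (the function names here are made up):

```python
def trampoline(fn, *args):
    # Keep calling as long as the function hands back a thunk;
    # the Python call stack never grows past one frame.
    result = fn(*args)
    while callable(result):
        result = result()
    return result

def countdown(n, acc=0):
    if n == 0:
        return acc
    # The tail call is rewritten as a returned zero-argument thunk.
    return lambda: countdown(n - 1, acc + n)

# Sums 1..100000 without ever hitting the recursion limit.
print(trampoline(countdown, 100000))
```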

> Lambda is awful

Lambda is wonderful. It keeps people from writing huge inline callbacks like
they do everywhere else. It's one of the best decisions Guido ever made.

With lambda + decorators + list comprehensions, the need for multi-line
callbacks is small.

You want more? Write a regular function. How hard is that?

It's not hard. So in the end it's a matter of...

... wait for it ...

taste.

I would have liked a shorter keyword though. But I can live with it.
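The trade-off in practice, as a small illustrative sketch: a lambda where a single expression suffices, a regular function the moment it doesn't:

```python
people = [("alice", 30), ("bob", 25), ("carol", 35)]

# One expression: a lambda as a sort key is idiomatic.
by_age = sorted(people, key=lambda p: p[1])

# More than one expression: just write a regular function.
def describe(person):
    name, age = person
    label = "senior" if age >= 30 else "junior"
    return "%s (%s)" % (name, label)

print([describe(p) for p in by_age])
```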

> Inadequate data modelling facilities

"Inadequate data modelling facilities" because classes are verbose? Overkill
title much?

Besides, if you just need a container, you use a dict in Python. Not a class.
At most you use SimpleNamespace:

    
    
        >>> from types import SimpleNamespace
        >>> SimpleNamespace(a=1, b=True)
        namespace(a=1, b=True)
    

But again, this is "I want a pony"-grade complaining.

I do think classes are too verbose in Python (I use the attrs lib because of
this). But this is childish.

Algebraic data types and the whole dunder-methods-vs-interfaces question are
more interesting debates.

> Lack of switch (or match)

> No, dicts with lambdas (see above) are not a replacement. No, long if-else
> chains are not a replacement. I want a nice way to match on data (preferably
> richly -- as with ADTs, ranges, ...) and associate matches with logic

Yes they are, for switch. switch is the most overrated statement after goto.
It's unneeded, as you can express its logic perfectly well without it. And
again: rare use case. There is nothing wrong with a bunch of ifs or a dict.
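The dict-of-functions pattern being defended here, as a small sketch (handler names are illustrative):

```python
def on_get():
    return "fetching"

def on_post():
    return "creating"

# The dict plays the role of switch; .get() supplies the default branch.
handlers = {"GET": on_get, "POST": on_post}

def dispatch(method):
    return handlers.get(method, lambda: "unsupported")()

print(dispatch("GET"), dispatch("DELETE"))
```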

Now for match, it's a different story. Pattern matching would be a nice
addition for Python IMO. But again things like:

> Please do not suggest awful hacks to do this, and fix your language instead.

is arrogant and ignorant.

The mailing list has been discussing it for years. It hasn't happened, because
there's no such thing as a magic way to make everybody agree, then implement it
and maintain it for free.

Things have costs. People have tastes. Code bases have legacy requirements.

~~~
olau
"No. Python 3 arrived quickly after many warnings. Then tools, tutorials and a
looooooooooooot of time have been provided. This is nowhere Python's fault.
It's the best damn migration story I've ever witnessed in my life. My only
grudge on Python 3 is that it didn't break ENOUGH. I wished for stuff to have
changed more."

I think you wrote a good comment, thanks!, but regarding 2 to 3, I think you
got this wrong. I think a more gradual transition would have helped - people
ended up putting off the porting work, which was easy because Python 3 was
installable in parallel, and nobody was really using it etc., and then it
really went dead for some years, which I think was counter-productive for
everyone.

I think in general it's better to keep some compatibility glue code around
until most people have migrated instead of letting a let's-clean-this-shit-up!
frenzy prevail.

(Now hindsight is everything, etc. etc.)

~~~
sametmax
The problem I have with this theory is that the JS and Ruby communities had big
breaking changes, the devs told them to fuck off, and the communities adapted
quickly.

Python took care of the community, giving time, tools and doc. And nobody
moved. But they surely complained a lot.

What does that say ?

~~~
amyjess
In the case of Ruby, it's because the vast majority of the Ruby community is
tied to a single framework. Ruby developers go where Rails goes.

~~~
ThatGeoGuy
But this just highlights the case with JavaScript even more. In JavaScript
nobody is tied to any framework, and they can easily leave for another
framework at any time. Heck, with jQuery, it doesn't even matter if you're
writing ECMAScript 3 or 6; everything still pretty much works the same out of
the box.

At the end of the day there's a lot to be said about the transition from 2 to
3, but I think in general the Python community got off easy compared to some
of the breaking changes in other language communities.

------
protomok
One of the biggest Python issues I see is the inability to hide or protect
Python source code.

'Compiling' into byte code is easily reversible using pip packages like
uncompyle2. Various pip packages offer code obfuscation but from my tests
cause problems when running the code. Encrypted bytecode seems to always be
decryptable due to the very nature of having an interpreter. Moving Python
code into modules implemented in C somewhat works but is time consuming and
makes me consider just rewriting everything in C/C++ :(

I would be curious to know how other folks hide/protect Python code? I see
this issue as a major barrier to getting Python adopted in paranoid tech
companies!

~~~
nneonneo
Naively compiled C/C++ is fairly easy to reverse engineer (I say this from a
lot of experience!).

If you want to "protect your source code" you need to apply obfuscation
techniques to slow down a reverse engineer - but keep in mind that everything
ultimately can be reversed and understood given enough time. Plus, many
obfuscation techniques can be made applicable to Python code too (e.g.
encrypting, obfuscating or mangling Python bytecodes).

The real question is: what are you protecting that is so secret? If it's
details about a protocol (network messages, file format or external API calls)
those are fairly easy to dissect externally. If it's a proprietary algorithm,
someone could blackbox the relevant parts of your code to use in their own
application, without even reversing it. If it's proprietary data, client-held
keys, etc. there are ways to get at it. Assume that everything you hand a
client is no longer secure or private - if you really need to keep secret
sauce close to home, make it server-side.

~~~
jgalt212
> Naively compiled C/C++ is fairly easy to reverse engineer (I say this from a
> lot of experience!).

So I assume your position on reverse engineering Python bytecode is that it's
trivial.

~~~
nneonneo
It can be, since Python is a higher-level language, but it isn't necessarily
easier. For one, the state of decompilation technology is much more primitive
for Python - the decompilers I've used are more like pattern matchers and
break if you even slightly tweak the bytecode or use fancy constructs.

Second, although Python by default outputs plenty of symbolic data to assist a
reverse engineer, these can be stripped (just like a C/C++ binary can be
stripped), leaving you with a bunch of duck-typed method calls and operations.

------
d0mine
I've seen most of the points discussed several times already on the
python-ideas and python-dev lists -- that is, at the very least some of the
points have merit, and if the author has anything new to add then these lists
might also be the place to do it.

> Even weirder, str and list both have find, but list does not have index (a
> related method).

It is the reverse: both str and list have index() methods. Only str has
find(). To find out whether an item is in a list in Python:

    
    
      if item in your_list:
          ...
    
    

> class FooNode

There is _attrs_ package [1], to avoid boilerplate for a mutable analog of
collections.namedtuple ("case classes" [2]):

    
    
      @attr.s
      class C(object):
          x = attr.ib(default=42)
          y = attr.ib(default=attr.Factory(list))
    

> Lack of switch

It is hard to discuss it without a specific code example from an existing
popular codebase that shows the advantages of "switch" statement (to compare
the current code and how it looks like with a suggested "switch" syntax). To
justify a new syntax you should be able to find dozens of applicable examples
easily.

[1] [https://pypi.python.org/pypi/attrs](https://pypi.python.org/pypi/attrs)
[2] [http://www.codecommit.com/blog/scala/case-classes-are-
cool](http://www.codecommit.com/blog/scala/case-classes-are-cool)

------
pcattori
The point about "Inadequate data modelling facilities" is why I wrote Maps:
[https://github.com/pcattori/maps](https://github.com/pcattori/maps) .
Specifically, the "Named Maps" variants provide the same interface as
`namedtuple` but for different levels of immutability/mutability.

Feedback/suggestions welcome!

~~~
brandojazz
I use the beta version of this library all the time in my code! It was really
simple and my code was cleaner, easier to read and write! (link to beta
version of library:
[https://github.com/pcattori/namespaces](https://github.com/pcattori/namespaces)
). Thanks and I am looking forward to try out Maps now!

------
nicwest
could someone explain to me why

    
    
        >>> ranges = [range(i) for i in range(5)]
        >>> [*item for item in ranges]
        [0, 0, 1, 0, 1, 2, 0, 1, 2, 3]
    

would be better than:

    
    
        >>> ranges = [range(i) for i in range(5)]
        >>> [item for subrange in ranges for item in subrange]
        [0, 0, 1, 0, 1, 2, 0, 1, 2, 3]

~~~
shawabawa3
Less verbose.

Also, what happens if you have even more nested lists you want to flatten?

Personally I think it should just be

    
    
        flatten(range(i) for i in range(5))
    

With flatten in the global namespace

~~~
OskarS
Given that flatten is surprisingly tricky to get right, it really should be
built-in. The naive recursive variant will crash python if you nest lists
beyond the stack limit, which is very no bueno. A list that's nested 10,000
layers deep is not especially hard to create or store in memory, and a flatten
implementation should be able to handle it without crashing the interpreter.

In fact, it's not a bad little programming exercise: making a flatten that
performs well and never crashes because of stack overflow.
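One way to pass that exercise is to use an explicit stack instead of recursion, so deeply nested lists can't overflow the interpreter stack; a sketch, not a proposed stdlib API:

```python
def flatten(nested):
    # Depth-first walk driven by an explicit stack of iterators,
    # so nesting depth is bounded by memory, not the recursion limit.
    stack = [iter(nested)]
    result = []
    while stack:
        try:
            item = next(stack[-1])
        except StopIteration:
            stack.pop()
            continue
        if isinstance(item, list):
            stack.append(iter(item))
        else:
            result.append(item)
    return result

# A list nested 50,000 levels deep: far beyond the recursion limit.
deep = [1]
for _ in range(50000):
    deep = [deep]
print(flatten(deep))  # [1]
```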

~~~
toyg
I've not tried, but you can probably get that by recursively mixing standard
constructs and itertools.chain.from_iterable().

~~~
OskarS
You should try. It's harder than it seems.

(I mean, it's not the most challenging problem ever, but most programmers look
at it and go "that's trivial, just do X!", and it's a bit trickier than that).

------
gigatexal
I agree on lambdas, and on the gist that things must look Pythonic to be
accepted, as that holds the language back from iterating or evolving. The rest
of his points read like scope creep, in that he seems to want the language to
be something it's not.

------
jogjayr
Serious question: is the whole deal with the Python GIL solvable if some BigCo
decides to throw a ton of money and engineers at it? Like Google with V8, for
instance. Or is it a truly hard problem that will take something special to
solve?

~~~
stcredzero
For a while, Python was one of the 4 approved languages at Google. The answer
is "probably no." If it were easily solvable, Google would already have thrown
money and engineers at it.

~~~
paulmd
See my sibling comment to yours - but care to explain your thoughts more?

Jython already fixed the GIL problem. The problem is the legacy codebase built
on assumptions of non-concurrency, which is just a matter of engineer time
i.e. throwing money at it, plus getting GVR to sign off on it.

~~~
stcredzero
_Jython already fixed the GIL problem. The problem is the legacy codebase
built on assumptions of non-concurrency_

Which is to say that Jython didn't completely fix the GIL problem, from the
POV of a lot of people.

------
toyg
This is a tired, trolling post. Most of these issues have long been addressed
as non-problems or personal preferences; when the author says "Incompetence?
Politics?" what I hear is "people don't listen to me, probably because I don't
know what I'm talking about". The attitude is confirmed by his/her conflating
of stdlib gripes and language gripes - two _very_ different sets of problems -
and mixing requests for speed with requests for more lambda support, two
things that are notoriously unlikely to go hand-in-hand.

~~~
vegabook
You seem confused. Please explain how lambda and performance are "notoriously"
unlikely to go hand-in-hand. They're orthogonal, yes. "Notorious"... what does
that actually mean?

Also I will disagree that stdlib and core are "very different problems".
Exhibit A: Go delivers stdlib and core language together, hand-and-glove
style, with out-of-the-box huge functionality. It's one reason why it's
killing Python. Stdlib is a key part of language functionality and is
intricately linked to uptake. Just ask Ocaml.

------
hpaavola
Writing self all the time. And the lack of switch. Everything else is great.

------
Pxtl
I honestly don't get why they made the big compatibility-breaking move to
Python 3 without using that opportunity to change things for better
performance and no GIL.

~~~
vegabook
this is the essence of the problem for Python's long term future. They've been
so burned by the 2-to-3 mess that nobody will ever dare touch the fundamentals
again. As you say, some of this stuff (performance, multicore) should have
been slotted into 3 since it was breaking-change already, even if delaying it
by a few years. Then _everybody_ would have moved, pronto. Now, even if 3
finally snuffs 2 out, we'll be stuck with the fairly unsatisfactory 3
underlying architecture essentially forever.

~~~
BuckRogers
Please don't encourage them. I'm fairly certain the CPython core dev team will
take almost any suggestion like this as a challenge and break everyone's code
again in Python4. They see it as stabbing back at those corporate freeloaders.
Or at least that's the public front. I'd just like them to take lessons from
Go and actually get unicode right. I'm far more interested in Grumpy. Python2
and Grumpy seem like more of an ace in the hole than Python3.

~~~
vegabook
Well said. Google hired Guido, and from being a big Python shop was so
disenchanted with Unladen Swallow's abject failure that they invented a whole
new language to replace it, and Guido became surplus to requirements. And the
last vestiges of Python are now to be piped through Grumpy. Not a single
mention of 3.x and Google in the same breath. Py27+Grumpy looks great.

------
numlocked
For flatten, use:

    
    
        flattened = sum(list_of_lists, [])

~~~
ThatGeoGuy
This only flattens one level. Consider the following snippets:

    
    
        ; CHICKEN Scheme
        #;1> (flatten '((1 2 3) ((4 5) 6) (7 (8) (((((9))))))))
        (1 2 3 4 5 6 7 8 9)
        #;2> (apply append '((1 2 3) ((4 5) 6) (7 (8) (((((9))))))))
        (1 2 3 (4 5) 6 7 (8) (((((9))))))
    

vs.

    
    
        # Python 3
        >>> sum([[1,2,3], [[4,5],6], [7, [8], [[[[[9]]]]]]], [])
        [1, 2, 3, [4, 5], 6, 7, [8], [[[[[9]]]]]]
    

There's a big difference here. Flattening a list to just the elements inside
isn't terribly hard, especially in a language like Scheme with tail-recursion,
but flatten is definitely something that should be in the standard library.
The "flatten" you propose is really just appending the elements of the first
level of the list.

~~~
brettcannon
See [https://bugs.python.org/issue27852](https://bugs.python.org/issue27852)
for a discussion as to why there isn't a more general flatten().

------
rushi_agrawal
I realize that after reading the article, most people (including me,
unfortunately) read it as 'Problems WE have with Python'. Maybe a line by the
author at the top or bottom of the article, reiterating that it's the problems
'he' has with Python -- I know nobody would think such a second clarification
would be necessary, but hey, we're humans! -- would help.

~~~
darkf
Yeah, I'll keep that in mind -- some people, even programmers, apparently
don't like to read carefully. :)

------
stared
[http://coconut-lang.org/](http://coconut-lang.org/) looks really interesting
- it seems to patch exactly the parts of Python I am missing.

(Though I'm not sure I want to use another language just for that. Vide
CoffeeScript and JavaScript; in this case JS absorbed the best parts of CS.)

~~~
darkf
> in this case JS absorbed the best parts of CS.

Actually my favorite part of CS is the instance var initialization, e.g.:

    
    
        constructor: (@x, @y) ->
    

would initialize @x and @y to the arguments. It makes writing records much
nicer.

------
zde
Surprised nobody mentioned the "Python unicode" situation:

[http://lucumr.pocoo.org/2014/5/12/everything-about-
unicode/](http://lucumr.pocoo.org/2014/5/12/everything-about-unicode/)

------
xkxx
> Even weirder, str and list both have find, but list does not have index (a
> related method).

I believe you meant to write "str and list both have _index_ , but list does
not have _find_ ".

------
riprock
My data structure wish list:

\- heapq to support max heaps better. (You can negate the values or use
heapq._heapify_max; neither is ideal.)

\- Tree map.

------
foota
Does any language have split on a list?

------
codr4life
I've spent so much time struggling with Python over the years, traveled
across Europe for PyCon and tried to tune into the community. I really wanted
it to be the good-enough Lisp that Norvig claims it is. But in the end I
always come out of it swearing to never touch the inconsistent, arbitrary pile
of exceptions again. Conceptually, it's C++ in scripting-language clothes.

------
SFJulie
I love how so many people focus on the GIL and multithreading, when the GIL is
much more a solution for making non-thread-safe libs safe to use, and most
people don't see that POSIX threads are an inherently broken abstraction. [1]
[http://www.daemonology.net/blog/2011-12-17-POSIX-close-is-
br...](http://www.daemonology.net/blog/2011-12-17-POSIX-close-is-broken.html)

In fact it pretty much boils down to signals being broken on unices [2]
[https://lwn.net/Articles/683118/](https://lwn.net/Articles/683118/)

Which, even though I have a hatred for systemd, systemd is trying to fix by
moving past the status quo. However, POSIX signals are still a problem for
systemd [3]
[https://github.com/systemd/systemd/issues/1615](https://github.com/systemd/systemd/issues/1615)

Having played with signals in Python+C, my experience is that Python has some
holes around signals: no mask can be set. I initially thought Python sucked
because of the lowest-common-denominator problem of system languages (being
forced to support only the small subset of features common everywhere). But I
now think POSIX signals are just a broken OS-level software interrupt
implementation.

So, going down the rabbit hole after reading Stevens on Unix/POSIX programming
(a must-read), I am pretty much questioning Fred Brooks' (and hence K&R&T's)
biggest failure: OS/360, followed by the unices.

What if our quest for a multitasking portable OS that does not care about the
HW is doomed?

It does a darn good job in 99.999% of cases. The 0.001% remaining being the
signals.

Look at it, what is a process meant to be?

A container running code.

A thread? Cooperative code sharing data. But how do you cooperate? You send
signals.

The problem is that, under heavy use of signals, the OS gets "signal bound" in
a way we cannot measure.

Signals are like a huge software bus that is not easily measurable and, unlike
a lot of primitives, cannot be HW bound. It is basically a software bus that
tries to convey the concept of HW interrupts, which are normally handled by
dedicated chips. Look at the MC2828 brochure and you can recognize the
features signals are trying to provide [4]
[https://upload.wikimedia.org/wikipedia/commons/3/31/Motorola...](https://upload.wikimedia.org/wikipedia/commons/3/31/Motorola_Microcomputer_Components_1978_pg10.jpg)

So to wrap up, we may have a problem of HW architecture that results in a
buggy implementation of a common API. Like trying to emulate an MMU on an
MMU-less CPU.

And I would say that it is thanks to my experiments in Python that I
discovered signals are an unreliable 1-bit message delivery protocol. Python
made it easy to experiment.

Python has problems (mostly a weird mix of conservatism, progress on concerns
I don't share, and politics). But overall it is a good system language that
deals with real problems. And poor support for signals and threading is not a
bug in Python.

A good system language does not try to fix system glitches; it lets them stay
obvious. All the hate against the GIL/threading/signals/weird async IO may be
better directed at the quest for a portable, generic multitasking OS.

Threads (and implicitly signals), on the other hand, are convenient fantasies
that we would like to exist but are actually just fantasies. And to solve the
problem, we invented containers... based... on cooperative multitasking
systems... based... on threads and signals.

~~~
dkersten
The GIL doesn't magically make non-thread-safe code thread safe. It makes
Python's reference counting implementation thread safe.
~~~
SFJulie
The core dev having worked on new GIL (py3.2) explained this to me.

I never said it was magic.

But he said the GIL is a tool to achieve thread safety in Python when calling
non-thread-safe code.

I am not him, I will not take on any argument of how it works.

But since Ruby's GIL is inspired by Python's GIL, let's hear from Ruby coders:
[http://www.rubyinside.com/does-the-gil-make-your-ruby-
code-t...](http://www.rubyinside.com/does-the-gil-make-your-ruby-code-thread-
safe-6051.html)

Oh, yes, it seems some people are seeing it my way and that it is a
controversial point that can be argued. So I agree to disagree.

Btw, I don't multithread and share state; I multiprocess with 0MQ and
communication patterns such as PUB/SUB and PUSH/PULL, for the obvious reason
that I really think multi-threading is an overvalued and wrong abstraction.

~~~
__david__
> I really think multi-threading is an over-valued and wrong abstraction.

That's as wrong-headed as thinking it's the only good abstraction. Each style
has its place—it really depends on what the code needs to do.

~~~
BuckRogers
I'd tend to agree with the parent that multi-pthreading is a poor abstraction
but I'd add an important note: only systems programmers should be explicitly
spinning up pthreads. Explicit pthreading is a poor abstraction for everyone
else. Application programmers are probably operating at the 'wrong' level and
greenthreads make more sense.

~~~
dkersten
Real threads are great for implementing things like clojure's core.async or
task systems like Intel's Threading Building Blocks. Both of these are super
useful and don't work as well with processes instead of threads. I definitely
agree that most programmers shouldn't be using threads directly, but they
should be available as a building block for higher level abstractions.

------
leog7
Why can't the author submit some of these changes? It's easy to rant.

~~~
darkf
Same reason you're complaining in a comment instead of doing things.

~~~
takeda
He's complaining at you complaining :P

