Hacker News new | comments | show | ask | jobs | submit login
Python 3 is killing Python (medium.com)
228 points by sebg 1031 days ago | hide | past | web | 298 comments | favorite



I doubt the lazy people who haven't moved to Python 3 will ever do a decent port of Python 2. Python 3 has been out for over six years now, with warnings before that.

I'd much rather bet on the python developers who are shipping software. All major python libraries have been ported to python 3, many of them for years. There's 4799 packages registered on pypi as supporting python 3.

Python is becoming less popular? The author provided no evidence at all. Downloads of python3 from python.org are higher than python 2, and have been for a while now. Job websites show python having more postings than 2 years ago.

Not much has changed in python 3? What rubbish. You just have to look at the thousands of commits and change lists. Python 3.4 has generic functions that do single dispatch on type. You can statically type check python now (see pysonar2, pylint, and pycharm for example). Not only has pypy made a major breakthrough with performant STM-based concurrency (no GIL), but threading has been improved in python 3 too. There is the new asyncio library for really elegant async, which goes along with great generator improvements (yield from). We have CFFI for simpler FFI. Lambdas are kept simple for readability, as functions are preferred.
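For readers who haven't seen the 3.4 feature mentioned above, here is a minimal sketch of functools.singledispatch; the `describe` function and its handlers are made up for illustration:

```python
# functools.singledispatch (new in Python 3.4) dispatches on the
# type of the first argument.
from functools import singledispatch

@singledispatch
def describe(obj):
    # Fallback for types without a registered handler.
    return "object"

@describe.register(int)
def _(obj):
    return "int"

@describe.register(list)
def _(obj):
    return "list"

print(describe(42))      # int
print(describe([1, 2]))  # list
print(describe("hi"))    # object
```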

I'm just going to stop here... many of his statements are just insane. Like this one: "You might as well ask someone to port their entire codebase to Ruby". Um, yeah... as if porting a codebase from python 2 to python 3 would take more time than porting it to an entirely different language. What complete nonsense!

Edit: the post isn't from Microsoft, so I'm fixing this so it isn't misleading. There was a link to a Microsoft person on the top right of the page, which seemed to indicate he was the author. Instead it was just someone promoting the page. Oops/sorry!


I'm the author. I've never worked at Microsoft (where did you get that?). I'm a Linux guy.

The "lazy people" you refer to is 98% of the Python community.

Python 3 didn't "gain" static type checking. You can (and I do) run pylint against Python 2 code which is just static analysis, not full type checking. The situation here has not changed between Python 2 and 3.

Python is glue code. You write a small piece of Python to tie together external libraries. Moving a smallish piece of Python code from Python 2 to Python 3 often means porting several 3rd party libraries. If you were to move to Ruby, all you'd need to rewrite is your own code, as there are likely already 3rd party libraries which need no porting.


> Python is glue code.

Um, no. Major applications are written in Python. It looks to me like the problem is not Python but your perception of it.


From personal observation I'd say that for every person writing major applications in python you have at least 20 using it for glue code. And having written a major application in python, I'm not really convinced it's a good language for it.


> From personal observation

Which is not a good metric. Have there been any actual surveys or studies of Python programmers to see what the distribution is?

> having written a major application in python, I'm not really convinced it's a good language for it.

Out of curiosity, what was the major application?


Nope, no metrics; I doubt very much they exist. But if I can pile on a second personal observation, a significant proportion of people programming python aren't python programmers. They're engineers, artists, scientists, analysts and statisticians who are using python to do their job. So even if you somehow did manage to survey a statistically significant number of python programmers, you'd still be missing a large swath of the python community.

As to the application, it was a largely in-house tool for an engineering company I used to work for.


> even if you somehow did manage to survey a statistically significant number of python programmers, you'd still be missing a large swath of the python community.

Depends on how the survey is done. If those engineers, artists, scientists, etc. are using Python, they're using Python, and there are ways to survey that. For one thing, you could look for users of the major Python frameworks listed in the Wikipedia article I linked to elsewhere in this subthread, like NumPy or SciPy.

If you want to claim that, for example, a scientist using SciPy isn't a real "python programmer" because he is only writing "glue code", I've already answered that in another subthread. Application logic is not "glue code". The scientist's application logic is in Python, so he's programming in Python, even if his job title doesn't say "Python programmer".


>Which is not a good metric. Have there been any actual surveys or studies of Python programmers to see what the distribution is?

So where are your metrics?


http://en.wikipedia.org/wiki/List_of_Python_software

This doesn't show how many programmers work on "glue code" vs. "major applications" (which is why I asked if anyone has data on that), but it certainly shows that there is a lot of Python code out there that is not just "glue code". Bear in mind that it's not just the particular packages listed on that page, since many of them are frameworks; there are also plenty of "major applications" written in those frameworks.

(I am counting major websites using a web framework, major numerical applications using NumPy or SciPy, etc., as "major applications", btw, not "glue code". I suspect on re-reading this subthread that the poster I was originally responding to might differ with me on that point, but application logic is not "glue code" in my view.)


>but it certainly shows that there is a lot of Python code out there that is not just "glue code"

Well, of course, this list would have only NON glue code projects!

Glue code stuff is used internally, in companies etc. It's not (usually) something that you put on GitHub or open source.


> Well, of course, this list would have only NON glue code projects!

And libraries and frameworks that support non glue code projects. Where is the list of libraries that are glued together by all this Python glue code?

> Glue code stuff is used internally, in companies etc. It's not (usually) something that you put on GitHub or open source.

People don't put their Django websites or their internal SciPy models on github either, so I don't think this means much either way.


In every Python book I read they were recommending me to use Python 2.7 instead of 3.0. Shouldn't it be the role of educators to promote the latest version out there and work with it, so that by the time the new students get into a working environment they are already on the same page with what is required?


Prior to the release of 3.2 (Feb. 2011, 2.5 years after the initial 3.0), that was clearly reasonable advice. The io subsystem in 3.0 was quite slow, making it uncompetitive with 2.6, and while 3.1 fixed that, there were still too many missing libraries (and similar issues).

I wonder what would have happened if the software that was called 3.0 had been released as "Python 3 Preview Release" or something. I guess the thinking was that would have held people back even more from trying it/porting stuff, but maybe scaling back (end user) expectations for those first couple of versions would have been the better path.


> In every Python book I read they were recommending me to use Python 2.7 instead of 3.0.

Well, for one thing, Python 2.7 is a newer release than Python 3.0 (2.7 was released after 3.1.)

However, while there may have been good reasons to choose 2.7 over 3.0 (or even 3.1), there aren't nearly as many reasons to choose 2.7 over 3.4.


> Shouldn't it be the role of educators to promote the latest version out there

I think the main role of authors of books on programming is to help their readers to become more effective in solving their programming tasks. So they do, if they're wise and honest.


> Python 2.7 instead of 3.0.

Maybe you need to read more up to date books. The latest release is Python 3.4, and even most Linux distros are at least at Python 3.2.


According to Distrowatch, out of the top twelve (12) distros, all but #8--Arch--use Python2.

The other 11 in order of popularity are #1 Mint, #2 Ubuntu, #3 Debian, #4 Mageia, #5 Fedora, #6 openSUSE, #7 elementary OS, #9 Zorin OS, #10 Puppy Linux, #11 PCLinuxOS, and #12 CentOS.

Most Linux distros use Python 2.


That's not quite a complete picture. Those distros include Python 2 as the default "python" executable, but many include Python 3 as python3, and even more have standard python3 packages that just aren't installed by default.


At work, I am writing tools in Python 2.6.6 specifically because the latest version of RHEL comes with only that, and I cannot get aftermarket versions of tools onto our servers. Is that the fault of Python? No, but I'm sure I'm not the only person working at a stuffy organization with crappy open source policy.


So am I (in some cases even stuck on Python 2.4) [RHEL 5]. But at the same time (re: Python 2.6), I never really thought "oh, I really need the latest 3.4 or my project will fail". It doesn't offer dramatic speed improvements, concurrency, some nice built-in web framework or anything.

I would take 3.4 if someone else spent the time testing and validating my code against it, but I just don't have the time to invest because I just don't see an upside to it at the moment.


> and even most Linux distros are at least at Python 3.2

Why are you spreading lies? You are hurting your cause (advocating Python 3).

Let's take a look at latest Ubuntu:

   $ cat /etc/issue
   Ubuntu 14.04 LTS \n \l

   $ python --version
   Python 2.7.6
How about the popular server distro CentOS 6?

   $ cat /etc/issue
   CentOS release 6.5 (Final)
   Kernel \r on an \m

   $ python --version
   Python 2.6.6
What exactly are these "most" Linux distros everyone is using that I haven't heard of?


> Why are you spreading lies?

Why are you assuming the worst possible intent on my part, instead of asking what I meant?

I meant the distros have Python 3 available, not that they make it the default that /usr/bin/python points to. Ubuntu has had Python 3 available for quite a number of releases; I run 12.04 which has Python 3.2 (and I've had it installed since I installed Ubuntu). According to the release notes[1], Ubuntu 14.04 makes Python 3.4 available, and work is ongoing to make it the default Python for Ubuntu. They also advise porting to Python 3.

It looks like CentOS makes Python 3.2 available, but I can't be sure from their online documentation.

[1] https://wiki.ubuntu.com/TrustyTahr/ReleaseNotes


> most Linux distros are at least at Python 3.2.

I don't know how else to interpret that besides "the base version of python in those latest distros is at least 3.2"

Shipping it as an "alternative" just splits the library world in two. So that is not a distro being "at" a python version.

> It looks like CentOS makes Python 3.2 available,

It does. Do all the python-* libraries work with it? Or do I have to install those separately?


> I don't know how else to interpret that besides "the base version of python in those latest distros is at least 3.2"

Then you could have said that, or asked if there was another way to interpret it, instead of starting with "why are you spreading lies".

> It does. Do all the python-* libraries work with it? Or do I have to install those separately?

If CentOS works like Ubuntu, then there will be python3-* libraries for use with Python 3.


> "why are you spreading lies".

But I also didn't say you are spreading lies maliciously. You can spread them unknowingly via mis-communication.

I could post something like "Large parts of the Linux kernel are written in C++". And it would be spreading lies. But it could just be me being confused, or thinking that a C++ compiler will compile C and that C is largely a subset of C++.


> I also didn't say you are spreading lies maliciously

My objection wasn't that you imputed malice; it was that you went straight to "spreading lies" instead of expressing uncertainty about what I meant. And you were wrong about what I meant.

> You can spread them unknowingly via mis-communication.

Which, since you were wrong about what I meant, is what you did when you said I was "spreading lies".


Many distros include both Python 2 and Python 3 support. I think pdonis is saying that distros that support Python 3 are not stuck on 3.0. That is, their Python 3 packages are 3.2 or higher, so comparing 2.7 to 3.0 is unreasonable, since it's not a choice anyone has to make. Which seems to be true. For example, the relatively conservative Debian is currently on Python 3.2, with Python 3.3 in the next version. Ubuntu's python3 is currently 3.4. Fedora appears to be 3.2. Arch has Python 3.4, I think.

So what pdonis was saying is, as far as I can tell, true.


They do. But python3 is not the default python.

Download a python script, run it, and it will run on Python 2.7.6.

Yes, the OP meant to say that, because he clarified it later. But I don't see how his original comment of:

> most Linux distros are at least at Python 3.2

implies that. To me that says 3.2+ ships as the default version of Python.


> python3 is not the default python.

The post I was responding to talked about books on Python for students, presumably students studying programming. Python 3 doesn't need to be the default python for students studying programming to use it; it just needs to be available. If the post had been talking about, say, sysadmins, that would be a different story.


I think it's recommended for distros to modify the shebang to /usr/bin/env python3 for scripts that are supposed to run under python 3, although this does pose a problem for scripts that want to run under both without modification. If the distro ships with python3 and most of the system scripts run under python3, I'd say it's the default. Ubuntu, for one, is eventually planning on not shipping a python2 or /usr/bin/python binary by default (although it will still be in the repos).


[deleted]


Sorry about that. I wrote that comment in too much of a hurry. Corrected now.


I think 3.0 was meant to be 3.x not 3.0 specifically.


You should try to run

    $ python3 --version
...instead.


Will the programs or scripts I download know to do that?


Any script worth its salt that depends on a newer feature of Python 3 can just check `sys.version_info`. This was a thing one had to consider back when new versions of Python 2.x were coming out, too.
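A minimal sketch of such a check; the cutoff version and error message here are just illustrative:

```python
import sys

# Refuse to run on interpreters older than the features we depend on.
# The (3, 0) threshold is an example; a real script would name the
# lowest version it actually supports.
if sys.version_info < (3, 0):
    sys.stderr.write("This script requires Python 3\n")
    sys.exit(1)

print("Running on Python %d.%d" % sys.version_info[:2])
```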


It's not quite that simple. Not all code that runs on Python 2 will be syntax-compatible with Python 3, and vice versa. You can, with some effort, use a subset of the Python syntax that is compatible with both interpreters, but many libraries do not do that, instead shipping separate Python 2 and Python 3 versions if they support both.
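To illustrate the compatible-subset approach (a sketch, not any particular library's practice): `__future__` imports make the same file behave identically on 2.7 and 3.x for print and division.

```python
# With these imports, this file runs unchanged under Python 2.7 and
# Python 3: print becomes a function and / becomes true division on
# 2.7 as well. (Under Python 3 the imports are no-ops.)
from __future__ import print_function, division

print(7 / 2)   # 3.5 under both interpreters
print(7 // 2)  # 3 (floor division) under both
```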


Syntax changes can be an issue for backwards compatibility, but that's not what I was talking about.

It's easy to specify what the minimum version of Python you want to support is. Even if there's syntax which is incompatible with Python 2 in the file, a hashbang that specifies `python3` is pretty clear.


Yes, #!/usr/bin/python3 is clear, but I don't think it addresses the question you were responding to, which was whether the scripts the OP of this subthread was talking about, which he knows will run under Python 2 (because he downloaded them and ran them under Python 2--he mentioned this in another subthread), will know to check for the presence of Python 3 and use it. Any such script will have to check sys.version_info to know which version of Python it is running under (and for at least one version the shebang will be irrelevant anyway since it can only specify python or python3, not both), and it will have to be syntax-compatible with both versions. If the script isn't syntax-compatible with Python 3, then he'll have to go find and download a version that is, if it exists.


Why? Languages are tools used for communications, if there is no one to talk to there is little point in learning a language.

English is a much better universal language than esperanto ever will be for this very reason.


There was a link to a Microsoft person in the top right of your page. I mistook it for the author link. Edited post to remove that.

You are right that the same tools work for python 2. However, I mentioned tools there which allow python to do type checking, and type inference because they can be done now. See here on how to use them: http://renesd.blogspot.de/2014/05/statically-checking-python...

You're right in that case where you use several libraries that aren't ported. However, there's 4799 packages registered on pypi as supporting python 3, and most of the popular ones are ported. Considering that most libraries are ported, porting small scripts should be easy most of the time. I don't know if you actually ported any code, but it's often trivial.


4799 out of 44198 are python3? Just a few more to go.


Not sure if you're just trolling, but the majority of the rest are probably close to being abandonware.


>Python is glue code. You write a small piece of Python to tie together external libraries.

Glue code is only one of the things that Python can be and is used for. Python is used for small, medium and large [2] applications as well.

Do all of the following look like glue code?

Recent marketing brochure from the PSF (Python Software Foundation) - (it's a PDF):

http://brochure.getpython.info/media/releases/psf-python-bro...

Python success stories:

https://www.python.org/about/success/

Google's web front-end (many of their URLs even end in .py), Dropbox clients (wxPython), YouTube [3], and Disqus are some apps / sites written in Python.

[1] See the paragraphs about YouTube near the end of this post (by me, but much before this current thread):

http://jugad2.blogspot.in/2013/03/youtube-dl-yourube-downloa...

[2], [3] The PyCon talk below is by one of the original engineers at YouTube.

https://us.pycon.org/2012/schedule/presentation/128/

The Wikipedia article below says YouTube is one of the largest web sites in the world. A lot of YouTube is written in Python.

http://en.wikipedia.org/wiki/YouTube


Do you have examples for these critical 3rd party libraries that haven't yet been ported?

As I noted elsewhere in this thread, the list gets shorter every day:

http://python3wos.appspot.com/


If you are running multiple automated eCommerce companies that have their core system written in Python 2 like I do, you'd find that boto/mws and python-excel aren't available for Python 3 (python-excel's porting to Py3 stopped in 2011), which would prevent you from spawning new businesses on Amazon or handling (unfortunately standard) MS Office interchange files quickly.

I actually made the decision to run on Python 2 due to unavailability of these libraries for Python 3 about a year ago (I didn't really care about Python before as I primarily used C++/Java/Asm/GLSL/C#/Haskell in the past) - I was deeply shocked by what Py3 did to its own community and still have that weird feeling that Python is slowly killing itself. I didn't want to write my own Amazon and MS Office wrappers in Python 3 anyway and from all platforms available Py2 seemed to have the least amount of troubles (though I had to write some Py3 code for handling advanced TLS properly).


yup. I think this is the reason so much eCommerce software is Java (& I'm sure there is a bunch of C# but that's not my thing). Java moves like a glacier but damn is it ever stable, and for those who wish to be "cutting edge" there is everything-under-the-sun on the JVM -- Clojure, Scala, Jython, JRuby, slimmer-by-the-minute Java APIs. And the community is just working their lil buns off trying to modernize this stuff while maintaining interop / backwards compatibility.

I think the flak Java gets is for not modernizing fast enough, but I think in a way that makes you stop and smell the roses (the roses being many years' worth of badass JVM features and massive flexibility/reliability in software development).

But Ruby & Python (I am an ex-Ruby guy) take this cutting edge thing wayyy too far & end up with a community of people who can't trust them anymore cuz they all had their damn feet shot off. Sometimes I see the rates for Ruby guys & wonder if they're high cuz it's so productive or just cuz it's very niche & the stack is shite. On principle, it's hard for me to imagine going back down that road after experiencing the JVM (plus, I'd like to get away from web one day...).

EDIT: Funny thing I just realized actually -- all the Ruby rates I've seen tend to be exactly on par with a lot of FUBAR niche near-extinct Java technologies. The only people I've seen do better than both these camps are down & dirty Linux/Oracle/Java guys. Maybe my view is skewed but I always feel like the rock-solid langs will live on & eventually "modernize" (though let's be real, modernize is kind of a BS term for slim API & lots of MVC tutorials). These other techs are at the mercy of market trends.


I worked at Sun in the distant past; backwards compatibility was mantra #1. Both the latest Java and Solaris were expected to run the oldest possible code, and they considered it one of their most important assets.

For many people in eCommerce and banking the arrival of Java was "manna from heaven", as before they had to deal with the crazy complexity of CORBA and similar technologies. Java simplified this significantly and added modern (for the time), easy-to-program and fairly safe multi-threading that enabled software like Apache etc. This made Java famous and gave it "huge karma" amongst developers. Without Java we probably wouldn't have experienced the massive expansion of eCommerce, and now we are lucky to see all of Java's issues - a good problem to have ;-)

Java and JVM as a consequence have their share of problems that arose from initial design decisions such as type erasure (which .NET's CLR avoided) that still deforms all languages based on JVM to this day. Massive boilerplating, J2EE's unreadable XML mess, complex design patterns invented to cope with the limitations of the language are some of the other problems. The compatibility with the old code is still there though.

I've never played with Ruby; I've heard it's still used by many young web companies for their frontends (like SoundCloud), and I've heard they had some release that broke most of the compatibility, which alienated many of their users. From this and from Python's example, I would suggest keeping the old way and introducing the new ways in parallel, so that developers don't have to rewrite/reinvent the wheel when somebody decides to change a part of the language. This served C/C++ pretty well: new features were added slowly, and instead of forcing users onto the "current right way", multiple different approaches were allowed to live together. I think that would have served Python well too - even Py3 feels at places like a hack, similar to Py2.


Java is simply becoming the next C. For a reason, in my opinion, since the languages share quite a few traits on both the language level and the ecosystem level. Both are fast. Both have a rock-solid execution system which sometimes feels backwards compatible to the first punch card made of rock (gcc vs the JVM). Both have huge communities with significant infrastructural services and other huge code bases developed in them (the Linux kernel, application servers, databases in both camps). The big improvement of Java would be higher programmer safety.

And to be fair, neither language is the prettiest, shiniest and fanciest for its time, but they get the darned job done.


you (& bitL as well) make some fair points but I actually (maybe because I'm newer to Java?) have quite a different experience with it. The XML setup seems minimal now (sure it's all there but most stuff is now handled with reasonable default configs & annotations).

Take JAX-RS as a good example. Once you understand it you're just like "Ohhhhh, this is the same way Rails/whatever framework handles json requests, but I can add in all this non-linear logic without forcing it to play nicely within a presumptuous design pattern". Adding many data sources & logic flows is easy.

Bottom line is that I feel like Java can never really be the next C because fundamentally it's much safer and becoming way more human-readable. It toes the line well -- you can code minimal stuff similar to the scripting languages, or you can drop years of work/research into optimization for high-transaction algorithms knowing that the platform won't abandon you.

(By the way, I am personally hoping for Clojure or node.js to take over the front-end of the stack & then just using Java to do any heavy lifting that is left over)


I just looked at that list and I see several prominent projects that (according to that list) don't support Python 3: MySQL-python, gevent, Twisted, eventlet, oauth2, thrift, nltk, mechanize, etc. I've used more than a few of these libraries over the years.

I mean, yeah, the list is getting smaller. That's understood. But that doesn't render the problem of depending on only-Python 2 libraries moot.

I think the fundamental insight of the OP is that Python X (the purgatory corresponding to code compatible with both Python 2 and 3) is an absolute pain to write. There's plenty you can do to make Python X easier (like the `six` library or ignoring Python 3.{0,1,2}), but the programmer is still left with the burden of writing code compatible with both versions.

There's just no denying that the Python 3 transition is a huge fucking pain that is costing people tons of time and money. With that in mind, it is absolutely reasonable that people are resisting the change.


Twisted is mostly ported, and you can install via pip and pypi. Some of the more ancient parts remain to be finished however.

gevent is ported, but not in an official release.

There is a mysql alternative module.

mechanize is ported, but not in official branch

nltk alpha3 for python 3 is available

There are python3 thrift ports, but the maintainers are slow.


Yes, I knew most of that. But the fact remains that they aren't complete yet and therefore can't actually be relied on as a dependency.

(N.B. I seem to recall someone telling me that gevent had been ported over a year ago and that all that remained was to just make a release. So count me skeptical until the port is complete.)

And that's just the most popular projects. Imagine what it looks like with the less popular projects. It isn't pretty. This is what the OP was talking about with the "tail" of Python projects.

N.B. I don't personally have this problem with less popular projects. I've always done my best to avoid depending on unmaintained or inactive projects because they tend to rot as time goes on (irrespective of massive changes like Python 2 and 3). But plenty of other people have this problem and it's definitely not an unreasonable problem to have.


Off the top of my head: Twisted and gevent. It was actually the lack of a Python 3 gevent that led me to create one crucial piece of software for my startup using Node.js, communicating with the main Python application by passing messages via Redis.


Twisted and gevent are available for python 3.


Look, I'm a Python 3 supporter, but you should never foist the Python 3 fork of gevent on anyone.

I'm pretty sure that, the monkey-patched way most people use it, it's a happy accident when gevent even works correctly on Python 2. This happy accident doesn't happen in an unofficial fork.


Available, yes. Working? No.

  File "/usr/lib/python3.2/site-packages/twisted/internet/_sslverify.py", line 1389
    self._cipherString = u':'.join(
                            ^
SyntaxError: invalid syntax

(Twisted 14.0.0, installed with pip)


Indeed, 73% of popular packages on PyPI already support Python 3: http://py3readiness.org

And it's getting better every day.


Numpy, Scipy, NLTK.

These are really HUGE and I cannot do my work without them.


And of those, only NLTK isn't compatible with Python 3.


To clarify: the NLTK develop branch has been compatible with Python 3.2+ for more than a year. NLTK 3.0a4 (http://www.nltk.org/nltk3-alpha/) is not released as NLTK 3.0 because of some pending API changes, not because of Python 3 compatibility issues.


I use NLTK with python 3.4 (and before that, 3.3); NLTK has an alpha release for python 3; I have not had any problems either installing it or using it.


Ok so why would I risk hitting even one incompatible library or have stuff break because those are dev experimental, untested branches. Why? What does Python 3 offer me except risk of breakage?


wxPython is a big one for me. But it looks more like it's dying altogether rather than not getting ported to 3.


Have you heard of python3-wx3-phoenix?

It is not dying. It just gets better.


I've heard of it, but there has never been an official release. I was looking forward to it for a couple years but I've given up.


Did you read the article? The long tail is the critical collection of libraries and most of them will never be ported. People write new Python 2 libraries every day, actually--the list is getting longer, not shorter. That said, gevent is an extremely critical library for a lot of production Python 2 code. There doesn't even seem to be a roadmap to port it to Python 3.


gevent is ported to python 3. I guess a proper release would be soon. See here: https://github.com/surfly/gevent/issues/38

If the libraries are used by people, they tend to get ported. Like most of the most popular packages, and 4799 packages listed on pypi.

Any others you depend on? I'm prepared to roll up my sleeves and help.


If you genuinely want to help, I suggest porting launchpadlib to Python3: https://bugs.launchpad.net/launchpadlib/+bug/1060734

This is affecting other parts of Ubuntu, as we don't always install python2 anymore after porting all the default apps to python3. The lack of python3 launchpadlib means apport (the automatic bug reporting tool) doesn't work on some installations.

edit: porting launchpadlib requires first porting lazr.restfulclient, which might be genuinely challenging: https://bugs.launchpad.net/lazr.restfulclient/+bug/1000801


Let's assume a modestly sized Python program uses 8 libraries or so. Assuming 80% of libraries are ported, that means 20% aren't. The likelihood of at least one of your 8 libraries not being ported is 1 - (0.8)^8 = 83%. In other words, most people will find that at least one of their critical libraries is not ported. That is certainly the case for me (the Bloomberg Python API, which is for 2.7, and which, btw, only came out 6 months ago). Inevitably the people using Python come from specific domains and use at least one domain-specific library, so while all the common "usual suspect" libraries are ported, there is usually a very high chance that one critical library is not.
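The arithmetic above is easy to check directly; this sketch assumes each library is ported independently with the same probability (the helper name is made up):

```python
# Probability that at least one of n dependencies is unported,
# given that each library is independently ported with probability p.
def prob_any_unported(n, p_ported):
    return 1 - p_ported ** n

print(round(prob_any_unported(8, 0.8), 2))  # 0.83, i.e. ~83%
```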

Perhaps the way the entire question should be thought of is: are there actually many properly useful Python 3 libraries that are not available in Python 2?? I bet the above equation would look much better when inverted.


I'd love to have scapy for Python 3. A lot of ports of it have been started but it seems none of them reached a working state.


There's a large number of significant commercial applications that use Python as their scripting and plug-in language. None of them that I am aware of are showing any interest in Python 3.


Also: Disqus, written partly in Python, mentioned on the High Scalability site:

http://highscalability.com/blog/2014/4/28/how-disqus-went-re...


> Not much has changed in python 3? What rubbish. You just have to look at the thousands of commits, and change lists.

Sorry, I'll have to disagree with your disagreement. Thousands of commits doesn't mean it is worth breaking backwards compatibility. Unicode support? Yeah it is nice. But it worked fine for me before. Print is a function. Again, pretty nice, not nice enough to break backwards compatibility.

> Python 3.4 has generic functions to do single dispatch matching on type.

Ok. Never really needed that in Python. So again not worth breaking backwards compatibility.

> There is the new asyncio library for really elegant async,

That is a bad idea. Promises and futures are not a very good concurrency construct. Light threads, processes, and message passing are better. Twisted has been around for ages, and even though it is used here and there it hasn't exactly taken the world by storm yet. So I certainly don't see it as a positive; rather, it is a negative. I would rather they adopted greenlet and supported gevent better, or integrated PyPy's STM.

> We have CFFI for simpler FFI.

Ok, I just used CFFI in Python 2 last week, worked great.

> Lambdas are kept simple for readability, as functions are preferred.

Heavy lambda usage is not that good of a practice in Python. And Python 2 has lambdas as well. I use them sparingly.

> I'm just going to stop here..

Oh no, please don't stop, because you haven't convinced many people Python 3 is better.

> You can statically type check python now (see pysonar2, pylint, and pycharm for example)

You know what has better static type checking? Go. While I sit here looking at static type checking and fixing code for backward compatibility, I will also be evaluating other languages and platforms.

Python 3 is not bad. But it is not that much better than Python 2. Sorry. A lot of that is not Python 3's fault as much as a testament to how good Python 2 already was.

I think Python 3 should have offered a lot more than unicode and print as a function. A list of possible improvements:

  * A JIT compiler

  * An STM module
  
  * Better concurrency (removed GIL)

  * Maybe integrated requests or flask or some other modern web tools and libraries


Ridiculous. Python 2 has many many warts and the right thing to do was to fix them. Why the heck is print a statement? Why do we have urllib and urllib2? Why are generators not that useful?

>> That is a bad idea. Promises and futures are not a very good concurrency construct. Light threads, processes and message passing are better.

Not so much in Python I'm afraid. The new asyncio module is worth the switch by itself I find, while Twisted is pretty good it can be very verbose and suffers from a number of issues.

You say that it's not worth breaking backwards compatibility for genuine warts in the language and seem to think it's only worth progressing the language if it's by implementing huge changes like adding a JIT or removing the GIL (why you would think of bundling requests or flask with Python is beyond me). Python 3 may not have those big blockbuster features, but it's a nicer language with fewer warts (from __future__ import division? please).
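For anyone who hasn't been bitten by the division wart: in Python 2, `/` between two integers silently truncates unless you import from `__future__`; Python 3 makes true division the default and keeps `//` for floor division. A minimal illustration (output shown is for Python 3):

```python
# Python 2: 1 / 2 == 0 unless the file starts with
#   from __future__ import division
# Python 3: true division is the default.
print(1 / 2)    # 0.5  (true division)
print(1 // 2)   # 0    (floor division, explicit in both versions)
print(7 // 2)   # 3
```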

I gave Python 3.4 a try while porting a Twisted application to asyncio and I was very surprised. After using it in a couple of Django projects I have yet to come across any packaging issues, and I use the awesome 'six' library so my code runs on 2 and 3 without modification.


> Why the heck is print a statement? Why do we have urllib and urllib2? Why are generators not that useful?

So what if it is? Why not just add a prnt function? Or println or something.

Keep urllib and urllib2. Make a urllib3, or just put requests in the stdlib.

Still don't see what was worth breaking backward compatibility over.

> You say that it's not worth breaking backwards compatibility for genuine warts in the language and seem

I say it is not worth doing it at the time they did. It just came too late. People who didn't pick Python over other frameworks probably weren't put off by the print statement.

This should have happened 5-10 years ago, not now. Now it was time for better performance, packaging, web frameworks, and better concurrency constructs.

> it's by implementing huge changes like adding a JIT or removing the GIL (why you would think of bundling requests or flask with Python is beyond me).

Removing the GIL shouldn't break sane code. With GIL you still have race conditions and have to guard against them. You just can't run CPU bound code in parallel. That would be nice to fix.

And if it is beyond you read up on how JPython works because it doesn't have a GIL.

> why you would think of bundling requests or flask with Python is beyond me)

How does bundling a library like that break code?

> I gave Python 3.4 a try while porting a Twisted application to asyncio and I was very surprised.

I switched to using gevent from Twisted and saw speed improvement and code reduction by about 30-50%.


> So what if it is. Why not just make prnt a function. Or println or something.

Make another function to print output while also having a print statement? "There should be one-- and preferably only one --obvious way to do it". Same with urllib. Let's just be clear: they didn't release Python 3 because urllib and urllib2 were not combined, so no, by itself it's not worth breaking backwards compatibility for (and hence why there were 2 urllib modules). But combined with a bunch of other issues, some of which necessitate breaking backwards compatibility, why not combine them?
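For what it's worth, the function form isn't just cosmetic: keyword arguments like `sep`, `end`, and `file` express things the old statement needed hacks for (trailing commas, the `>>stream` syntax). A small sketch, writing to a StringIO so the result is easy to inspect:

```python
import io

buf = io.StringIO()

# Keyword arguments the old print statement couldn't express cleanly:
print("a", "b", "c", sep="-", file=buf)   # sep= joins the arguments
print("no newline", end="", file=buf)     # end= replaces the trailing-comma hack

output = buf.getvalue()
print(repr(output))  # 'a-b-c\nno newline'
```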

> I say it is not worth doing it at the time they did. It just came too late. People who didn't pick Python over other frameworks probably weren't put off by the print statement.

Fair enough, but better late than never, right? You can't maintain backwards compatibility forever, especially with syntax changes. That's why it was put into a major release and the 2.x branch continued to develop. Porting the code isn't hard (there are even automated tools to do it, like Go's gofmt command), and I think every major web framework supports py3 now.

> How does bundling a library like that break code?

The best thing about Python packages is that they can be updated independently of the standard library that ships with Python. Why would you want to add flask or requests to the standard library? Is executing one shell command too hard? Including popular packages just because they are popular at the time is not a good thing to do.

> And if it is beyond you read up on how JPython works because it doesn't have a GIL.

You mean Jython? That's a silly comparison. CPython is the reference implementation for Python. A JIT compiler could be added to it, sure, but actually think about it for a second: writing a JIT compiler for Python is a huge undertaking. PyPy has one, but it's not 100% compatible (no eval and others for example, which is sometimes used) and for some workloads slows down execution. Not to mention PyPy is a very advanced project and it has taken years and years of development by people specialized in writing JIT compilers (remember Psyco for Python 2.5?) to get PyPy to where it is now, expecting the Python developers to whip up a usable JIT compiler in a jiffy is not exactly realistic. Adding one would add huge complexity to the reference implementation of Python, the same goes for the GIL. Read here[1] for reasons why the GIL will never be removed and use the awesomely simple multiprocessing library to spread work over multiple cores, that's why it was made.

1. https://wiki.python.org/moin/GlobalInterpreterLock
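The multiprocessing suggestion in practice, for anyone who hasn't used it: CPU-bound work is farmed out to worker processes, each with its own interpreter and its own GIL, so the cores actually get used. A minimal sketch:

```python
from multiprocessing import Pool

def square(n):
    # CPU-bound work runs in a separate worker process, so the
    # parent interpreter's GIL is not a bottleneck.
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The `if __name__ == "__main__"` guard matters: on platforms that spawn rather than fork, workers re-import the module, and the guard stops them from recursively creating pools.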


> "There should be one-- and preferably only one --obvious way to do it".

That or split the community and spend thousands and thousands of man hours, coding, planning, talking about, arguing on HN over a print statement vs function. I'll take a 2nd function over it any day.

> The best things about python packages is they can be updated independently of the standard library that ships with Python.

That is a separate issue. I was just responding to you saying that adding flask or some web framework or client would break code. And I think we can agree that adding a module won't break existing code that is not using it, wouldn't you agree? Now OK, maybe adding something like that is a bad idea as you said, as it will become ossified and kind of stuck (including bugs and warts). OK, sorry, that was a bad example. I was just trying to think of a big enough thing that _would_ have justified switching to a new version.

> You mean Jython? That's a silly comparison.

Sorry, Jython. Why is it silly? If the effort put into PyPy, combined with the effort put into the Python 3 transition (including time wasted arguing about it, porting, and designing shaming websites: "check out these unported Python 2 projects"), had been put into removing the GIL, it would have probably worked.

> Not to mention PyPy is a very advanced project and it has taken years and years of development by people specialized in writing JIT compilers (remember Psyco for Python 2.5?) to get PyPy to where it is now

Yes, I know. I was using Psyco back in the day, and every time I used it (and I did measure, and it did improve the speed quite a bit) I was thinking: why the heck is this not in the base distro?

Remember, I didn't say this hypothetical "better" Python 3 had to have yet another JIT separate from Psyco, Jython, PyPy, Unladen Swallow (LLVM). It could have been any one of those. And that would have been a worthy enough carrot to get people more excited. The port would have taken less time, with less drama.

> 1. https://wiki.python.org/moin/GlobalInterpreterLock

I know it is hard but I didn't say any of those would be easy. It is in fact kind of hard to beat Python 2 (or rather to offer something a lot more compelling to justify switching to it).

By implication here, not switching to 3 would have been a valid step as far as I see.


What's a good and performant MySQL library for Python 3 to use? Or did you use Postgres for all those projects?


I used postgres for all the projects I'm afraid, but the Mysql team provide a db-api compatible library here: http://dev.mysql.com/downloads/connector/python/ that supports Python 3


How about oursql (https://pythonhosted.org/oursql/)?


I agree. Python 3 fixes a lot of warts (range() defaults to xrange(), input() is the default) and does make the language overall more elegant (arbitrary head and tail unpacking with splat) but unfortunately it's not really worth a full switch. At least not for me.
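Two of the goodies mentioned, shown concretely (Python 3 syntax; PEP 3132 covers the extended unpacking):

```python
# range() is now lazy, like Python 2's xrange():
r = range(5)
print(list(r))        # [0, 1, 2, 3, 4] -- materialised only on demand

# Extended unpacking: head/tail splits without slicing.
head, *tail = [1, 2, 3, 4]
print(head, tail)     # 1 [2, 3, 4]

*init, last = [1, 2, 3, 4]
print(init, last)     # [1, 2, 3] 4
```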


A lot of the goodies from 3.3 were back ported into python 2.7, to ease portability. Python 3.4 makes it more backwards compatible still (whilst staying elegant). However, now there are also some new features that are not getting back ported.

Using argument annotations for type checking is quite useful in IDEs and for statically checking your code.
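To illustrate: argument annotations (PEP 3107) carry no runtime semantics by themselves; Python just stores them on the function, and IDEs, linters, and type checkers read them back. A sketch:

```python
def greet(name: str, times: int = 1) -> str:
    """Annotations are metadata; Python itself does not enforce them."""
    return ", ".join([name] * times)

# Tools inspect them via the __annotations__ attribute:
print(greet.__annotations__)
# {'name': <class 'str'>, 'times': <class 'int'>, 'return': <class 'str'>}
print(greet("hi", 2))  # hi, hi
```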

Asyncio, and yield from have been in the design phase for many years by Guido and the community. This is seriously well thought out concurrency done by people with lots of experience. It's the 4th generation async framework for python.
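The `yield from` improvement underpinning this (PEP 380) is worth seeing on its own, without the event loop: a generator can delegate to a sub-generator, and yielded values, exceptions, and even return values flow through transparently. A sketch of the delegation itself:

```python
def inner():
    yield 1
    yield 2
    return "inner done"  # a generator's return value propagates to the delegator

def outer():
    result = yield from inner()  # delegates, then captures inner's return value
    yield result

print(list(outer()))  # [1, 2, 'inner done']
```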

The better unittest framework saves me time, and helps me with error handling.

pip installed everywhere is a serious productivity improvement.

Function dispatch based on type (ie, pattern matching like some functional languages) is something I've missed for years. It was originally planned for py3k, and I think is the main missing piece from the original design docs.
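The single dispatch being referred to landed as `functools.singledispatch` (PEP 443): one generic function, with implementations registered per type of the first argument. A sketch:

```python
from functools import singledispatch

@singledispatch
def describe(obj):
    return "something else"  # fallback for unregistered types

@describe.register(int)
def _(obj):
    return "an int: %d" % obj

@describe.register(list)
def _(obj):
    return "a list of %d items" % len(obj)

print(describe(3))       # an int: 3
print(describe([1, 2]))  # a list of 2 items
print(describe("hi"))    # something else
```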

Importlib is much improved, so you can do some really interesting things with importing code. If you're working on IDEs, or other apps like games this is pretty cool stuff.

Little things like the enum module make APIs that tiny bit cleaner and more intuitive to use.
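The enum module (PEP 435, new in 3.4), for reference: members are real named objects rather than bare integer constants, with lookup in both directions.

```python
from enum import Enum

class Color(Enum):
    RED = 1
    GREEN = 2
    BLUE = 3

print(Color.RED)        # Color.RED
print(Color.RED.name)   # RED
print(Color.RED.value)  # 1
print(Color(2))         # lookup by value -> Color.GREEN
```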

Speaking of nicer APIs, the Documentation got a lot of fixups too in Python 3.4.

Want to track down memory leaks? Python 3.4 has a whole new set of memory debugging primitives to make this much easier.
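The primitives mentioned are the new tracemalloc module (PEP 454): it records where each allocation happened, so you can snapshot the heap and see which source lines are responsible. A minimal sketch:

```python
import tracemalloc

tracemalloc.start()

# Allocate something traceable (kept alive so it shows up in the snapshot).
data = [bytes(1000) for _ in range(100)]

snapshot = tracemalloc.take_snapshot()
top = snapshot.statistics("lineno")  # group allocations by source line

for stat in top[:3]:
    print(stat)  # one line per allocating source location, largest first

tracemalloc.stop()
```

Snapshots can also be diffed against each other (`snapshot.compare_to`), which is what makes leak hunting practical.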

There were also a lot of security improvements for 3.4 that have not been backported.


This just seems like an ad hominem attack. Just because the author works at Microsoft does not mean that he is speaking for Microsoft or talking on their behalf.


It's not just an ad hominem attack. I counter a number of his claims. The author provides NO evidence for any of the rubbish he's talking about.


There are barely any users of Python 3. New versions of Python 3 are mostly only picked up by other Python 3 users. Python 2 is growing faster in downloads than Python 3 is by PyPI numbers.

That is a problem.


Please make a new extension `py3` and run both on the same VM. I know the str|unicode|buf issue is not easy, but you guys have lots of smart people.


Not possible due to how the interpreter and extension interface is structured.


I have been racking my brain on how to remove all of the externs from the Python 2.7 VM, I think it just needs a big refactor to have a context object like Lua.


Do you have a link for these numbers?


I tweeted about this a while ago, I don't have the numbers at hand. Unfortunately to get those numbers you need to nudge the pypi maintainer as the data dump is too large to keep in a database without wasting money. As such if I would want to run the report again I would need to import multi gigabytes of logs again which takes time.


Currently python 3 packages ported are at about 10.9% according to pypi. This isn't including separate source downloads, so I guess the number would be higher by a few percentage points.

There was a post by Alex Gaynor in January [0] about downloads for python 3 being at 2.4%. But that was before python 3.4 came out, and before some major libraries were finally ported like Twisted. Twisted not being ported was one of the main blockers for python 3 in the last Ubuntu release [1]. Once more distros improve their python 3 support, things will get exponentially better for python 3. Continuous integration servers inflate numbers because they re-download things, and there is the CDN which makes all these numbers highly inaccurate.

Lots of things have been ported since January, like pycurl, pyOpenSSL, bugzilla [2], etc. Because of the big pushes by Debian, Ubuntu, Fedora, Mozilla, and Gnome [3] communities amongst many others. Things have improved greatly in a few months.

Distros, and apps like Sublime Text 3 moving to python 3, along with some of the major missing pieces will help uptake significantly. Almost all packages in Fedora come with python 3 packages now. But the key to increasing python 3 adoption is Apple, and python 3 on OS X. Many of the python packages that come with OS X are now ported to python 3 (Twisted, cups, pyobjc, etc) ... but who knows what they have planned? I guess we'll see June-ish.

[0] http://alexgaynor.net/2014/jan/03/pypi-download-statistics/ [1] https://wiki.ubuntu.com/Python/FoundationsTPythonVersions [2] https://fedorahosted.org/python-bugzilla/ [3] https://wiki.gnome.org/action/show//Initiatives/GnomeGoals/P...


Only a subset of twisted is ported. For example, twisted trial runner is not ported, so you can't run tests using twisted test runner in Python 3.


Yeah. Here's the issue for it: "Port twisted.trial.runner to Python 3" https://twistedmatrix.com/trac/ticket/5965


I'll take your word for it then ^_^


Poster mentions msft dropping support for ironpython (there were other languages too - ironruby, for instance) -- where "drop support" was to open source them.

Just last year, msft gave $100k to the ipython project -

http://ipython.org/microsoft-donation-2013.html

It feels like the original commenter has a beef against msft, reading company action into what is an individual's blog entry...


If they open source and cease to support, I doubt anyone in the Microsoft ecosystem would pick it up.


Most of the article (on medium) is an unsubstantiated rant. Granted that the move from 2 to 3 has been much slower than what you see for major revisions but it was a _big_ overhaul. In hindsight, maybe the developers could have done something different. But using phrases like "Python 3 is killing Python" & "you might as well port to ruby" is absolutely false & misleading.


It killed it for me. (So not absolutely misleading)


I'm afraid that still won't kill Python.


The OP did not say it would "kill" Python, just that it would cripple it. Personally, with Bloomberg bringing out a Python 2.7 financial module first, with no 3.x support, the entire financial world is now also poised to stay on 2.7. That's a classic example: Bloomberg only published this Python binding 6 months ago! No Python 3 for them.


I was referring to the title : "Python 3 is killing Python"

Anyway, it's a well-known fact that the 2-to-3 switch has tremendous inertia. Space, medical, and finance are a few of the domains where old is gold (tried & trusted). What Bloomberg did was logical, no surprise there. That decision says little about how Python 3 is impacting the growth of Python.


It is tremendously detrimental when a deprecated technology has all the inertia, because it is a dead end thanks to the Python bosses refusing to move it forward. Every new package for 2.7 (and no, this is not old gold - this is a brand-new library for access to Bloomberg's massive and uber-valuable database) invites more people into a dead end, and encourages those people to look much wider (beyond Python) when needing to upgrade. Python 3 may be growing in absolute terms, but in relative terms it's an absolute dog compared with its competitors, including Python 2. Python 3 in its current form is doing everybody a huge disservice.


> deprecated technology

Are you calling Python 2 deprecated? Could you provide a reference?

> not old gold - this is a brand new library for access to Bloomberg's massive

You clearly missed the meaning of 'old is gold' here. It's the 2.x Python code I'm referring to.


I see what you mean now on old is gold and yes I agree. Corporations love old proven tech for all the obvious reasons. Not officially deprecated but that's just semantics. The underlying point is that the Python leaders are trying to turn a superhighway into a dead end, and force a long and winding detour for most people onto another highway which is only marginally better. Being forced to take that detour simply encourages the exploration of other alternatives as well. By the way if they really had balls, they would've killed Python 2.7 in 18 months. Not 5 years. Force the choice. Clearly they themselves are not confident enough in 3.


"I doubt the lazy people who haven't moved to python3 will ever do a decent port of python 2."

That is a ridiculous statement. So most of the *nix OSes and the dozen other super-popular products that require Python 2.6 or 2.7 are lazy too?


> I doubt the lazy people who haven't moved to python3 will ever do a decent port of python 2.

These lazy people probably have a day job where they have to do something productive instead of wanking around with "fun" language features.


Not sure how accurate is this source but it shows:

PHP 82.0%
ASP.NET 17.5%
Java 2.7%
ColdFusion 0.8%
Perl 0.6%
Ruby 0.5%
Python 0.2%
JavaScript 0.1%

http://w3techs.com/technologies/overview/programming_languag...


Not very accurate. PHP is basically the only language which advertises itself both in file extensions (less and less common as people started using rewrites) and HTTP headers (unless disabled of course).


I really doubt there is more ASP.NET code deployed than Java; I'm pretty sure they are at least equivalent.

And ColdFusion over Python and Ruby? I don't think so.


Reading the methodology behind that survey, I'm very skeptical. Basically, the reason that PHP and ASP score so high is that they're the only ones that explicitly advertise their presence via file extensions.


Reminds me of Ruby, IMHO the version jump from 1.8.7 to 1.9.* killed it. Different scoping rules made porting a non-trivial task, many packages have never been ported to 1.9 and one of the early 1.9.* major releases was extremely buggy...

Nowadays hardly anybody seems to use Ruby for new projects...


I can't tell if this is a poor attempt at sarcasm, but if not:

Do you have evidence to back up the claim that Ruby isn't being used for new projects? Even assuming HN is a skewed environment, I see enough Ruby projects popping up here to make that seem unlikely.


Yes, not sure how much evidence you need. I mean just look at the Tiobe Index:

http://www.tiobe.com/content/paperinfo/tpci/images/history_R...

1.9 was initially released 2007, 1.9.1 was released in 2009... Not sure if you remember it, but one of the early 1.9 releases was reeeeally buggy.


Quite on the contrary. 1.9-compat was very quickly expected of all libs and all important core libs run better on >1.9.

1.9 was the cleanup release after the language became popular and very necessary.

Also, everything before 1.9.2 was considered a preview release.


> Quite on the contrary. 1.9-compat was very quickly expected of all libs and all important core libs run better on >1.9.

That's quite some historical revisionism -- there was quite a while when quite a lot of libraries weren't running on 1.9. It wasn't as bad as with Python, probably because Ruby wasn't popular for as long as Python was, so there weren't as many big libraries that were hard to update. But it was a huge deal.

> 1.9 was the cleanup release after the language became popular and very necessary.

It was necessary for much the same kind of reasons (and, sometimes, the exact same specific reasons) as Python 3 -- particularly, the string encoding problem.


> That's quite some historical revisionism

I concur. The _expectancy_ that libraries should be updated was there pretty quickly. New libraries were generally created with both versions in mind. It was a huge porting effort, especially with 1.9.0 being very buggy, but there was a general sense in the community that you need to port or die.


At least the whole Google Appengine Ruby suite never worked on 1.9. Also JRuby took some time to be 100% 1.9 compatible.


We're at Ruby 2.1.x and every single gem I used in the past six months has been ported to it. I use Ruby only for Rails so I might be seeing a small environment, but it's 96 gems in the project I have in my editor right now. There are still many new projects in Ruby. I probably make half of my income with them.


> We're at Ruby 2.1.x and every single gem I used in the past six months has been ported to it.

Yeah, but 2.x didn't have major breaking changes. The big breaking change from 1.8.x that parallels the Python 2->3 change was 1.9, which has been out for two years longer than Python 3.


I don't think people are used to backwards compatibility breaks in languages. Though in some ways, that means you have a different language.

In reality, Python 3 is to Python 2 what Ruby is to Python 2 - different. They aren't the same language; you could have named Python 3 Anaconda (only because Cobra is already taken -- and yes, I know Anaconda is already a prepackaged library bundle for Python) and its function would be clearer.

But you also can't tell people writing free software "no, you have to stop doing what you want, and do what I want" unless you are willing to negotiate how much that will cost you. And maybe they wouldn't do it for any price. You could get other developers to extend it on your dime as well.

But the developers of Python collectively said "unicode is important enough to make a backwards incompatible leap". These are never painless. OpenGL is a great example, with how the version 3 line was rife with attempts to deprecate the fixed-function legacy cruft, and even today few developers properly differentiate compatibility and core profiles. That is a major version break too.

But if you don't like it, you can make your own. The fact that no major, popular Python 2 extension project unofficially backporting functionality has gained traction should say what the popular consensus on Python 3 is.

If you need Python 2 libraries, use Python 2. There is really no shame in doing it. Even when the official project deprecates the maintenance of 2.7, someone will pick it up, because there will be Python 2 software running somewhere for decades at least, and probably for eternity, whose maintainers think keeping Python 2 working is less effort than porting their stack to 3.


I don't think people are used to backwards compatibility breaks in languages.

Nor do they like it, because most if not all developers are using these languages for a practical purpose: to get things done. Having to relearn and "fix" things which otherwise wouldn't need to be has a cost, and this is on top of what they already have to do.

I personally think backwards compatibility and gradual change - evolution, not revolution - is the way to go. Outside of the software industry this is the norm; think of all the "interfaces" that we use in the physical world that have remained backwards-compatible and relatively stable over long periods of time. Software is different because it can change quickly and (by itself) easily, but just because it can doesn't mean it should, as if it is well-established, there is certainly a large number of dependencies on it too.


"I personally think backwards compatibility and gradual change - evolution, not revolution - is the way to go."

I'm gonna guess that you're not new to C++. I say new because people who have been following it and picking up all the changes along the way often can't see the forest for the trees. That forest is ugly.


I'd argue C++ is doing it really well. They took their shitty awful C++03 base that made developers around the globe cry at night, and tried appending on top of that mess less terrible concepts.

And now, modern C++11/14 doesn't look anything like C++03. Which is a blessing and a curse - they effectively deprecated a huge swathe of old practices for better new ones (refs over pointers, smart constructs over raw new / delete, lambdas and auto over void* function pointers, and soon we're going to have concepts and static if instead of preprocessor ifdefs).

But that comes with a price. The price is that C++ is huge, the compilers are slow due to the complexity, and it is really analogous to per project technical debt.

D exists because C++ was too slow. Walter (and Andrei) saw this huge shortfall in standard C++03 and this entire class of functionality that should have been there, because it doesn't have a performance overhead, but wasn't. Maybe they also saw how C# and Java were useful and popular, but they put this massive VM under them to simplify the implementation and standardization.

Go exists because C++ was too big - for its purpose, of web server development, you wanted native performance with natural asynchrony and multi processing. It still doesn't have generics because it is pretty much a domain language, I think.

Rust exists because C++, in the wake of C++11, is still a complicated mess of syntax. And in order to maintain backwards compatibility, C++ is terribly unsafe and every line of code can destroy your program. I think Rust is as close to the future of native languages, because it does the C++ thing - a general purpose language, where everything is deterministic and predictable because it is direct binary translation between language constructs and assembler with limited magic - and does that backwards compatibility break to fix the significant flaws.

I still don't like the Rust syntax much, but it might one day grow on me. I don't think it's really approachable though. I'd definitely be interested in trying a "pythonic" statically typed language that emphasizes safety and productivity while maintaining the poster-child C++ ideology of providing reasonable abstractions that don't hide the details, but let the expert more intuitively interact with them.


Have you looked into Nimrod? It is different from Python in that it embraces some concepts that Python rejects (e.g., AST macros, which are actually quite brilliant in Nimrod) but it's the most pythonic programming language that I know of that compiles to native code.


Thank you for recommending Nimrod; I'd never really looked into it before. Nimrod looks like a cute little language, but it seems to lack a staple of many other languages: support for the '\t' character in indentation.


If the Python developers wanted to spend the time to reimplement bad behavior, they could have created a reverse "from __future__ import ..." -- something like "from deprecated import ...".

It would solve most of the problem, while encouraging a switch to the newer version by forcing every user to type "from deprecated import ..." at the top of their script.

I just think they didn't want to do that, given the massive work and non-existent personal gain. However, any company that wants to invest their money in it can, since it's open source.


This is a great idea!


The problem is not only Unicode.

If it were only Unicode, the problem would not be so big. But they seemed not to bother about backwards compatibility at all in the first Py3 version. Later, they did try to change some things that unnecessarily(!!) blocked adoption, but many projects had already decided to ignore Py3.

I guess, you don't mean what you said in the last paragraph? Was it just a joke?

Unless you can replace all libraries that exist only in a Python 2 version, you cannot move -- you can use zero Python 3. That is a huge problem for real projects -- of course, if you only have some university (educational) projects, all is fine and you can play around as you like.

Also there is the problem, that it is not trivial to have Python2+3 on the same installation. Most Linux distros still run with Py2.


> Also there is the problem, that it is not trivial to have Python2+3 on the same installation. Most Linux distros still run with Py2.

Actually, it's trivial nowadays with projects like pyenv. You can even have different python versions in a per-project-directory basis.

Check out the installation guide (https://github.com/yyuu/pyenv#installation) and a commands reference (https://github.com/yyuu/pyenv/blob/master/COMMANDS.md)

There's also an extension that provides virtualenv integration - https://github.com/yyuu/pyenv-virtualenv


Thanks, that looks interesting. I will note it for the time, I have the problem.


This is the same problem addressed by the ancient, universally known tool named virtualenv. How are you delivering pronouncements about the unsuitability of Python 3 when you don't have solid knowledge of how things are done in Python?


I did not "deliver pronouncements about the unsuitability of Python 3". I did only say, that it is not trivial to have both versions on one installation.

And who are you, something like the "Python purity police"?

I could ask you about something that you have not worked with till now -- and when you give a wrong or incomplete answer, I will also say "how dare you"!


In what distros is that a problem? I have Python2+3 on Debian running fine side-by-side.


When you have several Python programs, where does /usr/bin/python point to?

I must admit that I did not think too much about it until now, but normally there is only /usr/bin/python in the scripts.

Of course, if everybody has /usr/bin/python3 inside his Python3 programs, everything could work (unless some of your apps need special environment vars like PYTHONPATH, which could make it troublesome again).

But still, the standard installation of Debian (and others, as far as I know) does not include Python3 by default; you at least have to add it to your installation. That is something you must tell your customers when you ship Python3 programs.


/usr/bin/python should never point to Python 3. Because of the compatibility break, the convention (and default for the Python build scripts) is to always name the Python 3 binary python3, so there should be no conflict. It's understood that during the (very long) transition period there will be need to run both side by side.

A handful of distros have gone ahead and named it python anyway. A stated motivation has been to encourage adoption of Python 3, but practically they're just shooting themselves and their users in the foot, and ensuring incompatibility with the shebangs defined for every other OS. (I don't think there's any plan to ever switch the default binary name from python3 to python, so this incompatibility may never be resolved.)


Arch switched from Python 2 to Python 3, keeping /usr/bin/python pointing to their newest version of Python, before the convention of naming it /usr/bin/python3 was standardised.

It was specifically in response to the troubles caused by having /usr/bin/python be python3 in Arch that the guideline came about: http://legacy.python.org/dev/peps/pep-0394/


Thanks for the reference; I somehow failed to find the origin of this convention in previous searches.


Most package managers allow you to specify a dependency on Python 3, so you shouldn't even need to tell your customers. Otherwise, you can always use virtualenv and ship Python with your code.

And yes, if you're targeting Debian or Debian-like systems, make a deb. It's really not that hard, especially with newer tools like fpm.


Thank you for mentioning fpm, I'd never heard of it and often wished for something like it. Commenting to bring attention to it, and so I don't forget!


> When you have several Python programs, where does /usr/bin/python point to?

/usr/bin/python should always point to python 2 (or nothing)

/usr/bin/python2 should always point to python 2 (or nothing)

/usr/bin/python3 should always point to python 3 (or nothing)

Scripts using a shebang line should not, however, rely on /usr/bin/python pointing to python2, and should use it only when they are source-compatible with both python2 and python3.

http://legacy.python.org/dev/peps/pep-0394/
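The convention the PEP describes can be sketched with a tiny script; the file contents here are illustrative, but the rule is straight from PEP 394: a bare `python` shebang is acceptable only if the body runs unchanged under both major versions.

```python
#!/usr/bin/env python
# Per PEP 394, the bare "python" shebang is only appropriate because
# the code below runs unchanged on both Python 2 and Python 3.
from __future__ import print_function

import sys

def describe_interpreter():
    # sys.version_info works on both lines; index 0 is the major version.
    return "Running under Python %d.%d" % sys.version_info[:2]

print(describe_interpreter())
```

A Python-3-only script, by contrast, should hard-code `#!/usr/bin/env python3` so that it never lands on a Python 2 interpreter by accident.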


> (just because Cobra is already taken, I know it is a prepackaged library bundle for Python already)

Actually, the name really is taken, and by a programming language: http://cobra-language.com/ - it looked nice when I last looked.

I didn't know about Cobra Toolbox, however.


Was going to reply to the parent of your comment, but since you beat me to it:

>Actually, the name really is taken and by programming language: http://cobra-language.com/ - looked nice when I last looked.

The Cobra language does look interesting [1]. Checked it out a little a few weeks ago. I think it only runs on .NET and Mono, though.

[1] It uses ideas from several languages, including Python. On the Cobra site, there are a few short comparisons between it and other languages, including Python:

http://cobra-language.com/docs/python/

Also, it has a feature similar to Design By Contract like Eiffel does:

http://cobra-language.com/docs/quality/

Cobra is worth a look, and keeping eye on as it evolves, IMO.


I especially like the optional static typing and compile-time protection from nil/None. Its type system doesn't look very sophisticated, but since you get fully dynamic variables with no (syntactic) overhead, that's not too important. requires and ensures are very nice (as are tests), especially because the only other language with real contracts (Racket) focuses on functions and is a bit lacking when it comes to contracting methods with side effects.

It generally looks quite nice for a niche language, and it's good that it seems to be maintained. It seems that a JVM backend is underway, by the way. Boo was the other similar, interesting language I checked out some time ago, but it seems to be under less active development (if at all).

Anyway, if Python 3 is such a problem, then maybe let's port everything over to Cobra. It has snake in the name and familiar syntax and lots of goodies; imagine how far it could get if hundreds of thousands Python users would start working with it! (Just joking, although I'm wishing Cobra (and all other new or unpopular) languages the best of luck, of course)


Interesting points, thanks, and good to know about the JVM support.


OMF, I have been looking for Cobra's home page for years! I saw it a few years ago, and was quite impressed by it, bookmarked it and then forgot all about it. Then spent the next few years searching through Lambda the Ultimate, the Internet, etc to try and find it again. And now it's in front of me again, and more mature to boot! Whoot! :)


To be fair Anaconda is the name of a leading python distribution focusing on scientific python. Note, I work for Continuum.


Ruby did it and we're not wringing our hands about how Ruby is dead. It's just that the whiners feel particularly empowered in Python, they believe they have a chance of getting Python 3 erased.


Within the few industries I've had a peek at (not hip startups or web companies) this doesn't appear to be that big of a problem. Companies that rely on Python 2 code in the background seem to be sticking with Python 2 without any noticeable penalty or incentive to change. And places that have their core business wrapped up in python seem to be gradually making the transition OK. It's not without cost but you do what you have to do...

This sort of thing isn't uncommon... I see old perl installs behind production software, ancient versions of RedHat, Oracle, etc. and I deal with mostly technology centric companies so that's likely just the tip of the iceberg. Old software doesn't really go away.


I learned Python in 2004 (circa Python 2.3) and have since developed for several years mainly in Python, but somewhere along this path I lost almost all my excitement and joy for this language. I used to dream in Python; now I find it cumbersome. To me Python 3 was an unnecessary pain, and I also know some people are afraid to say so explicitly, because you want to look like someone who stays on top of their game by coping with new things - but knowing how to deal with Python 2/3 doesn't mean it was a good move. I'm now just happy to not have to develop in Python anymore.


What did you move to?


Not the same guy, but I moved to Haskell. Not due to Python 3, but because I've realized that to get the awesome features Python has, I don't have to give up the safety and performance I thought I had to.


yadda yadda. i'm happily developing in 3.4 and the improved exceptions alone make it worth it. the pain is pretty much limited to 3rd party libraries (getting better every day, though not nearly fast enough, true) and people not understanding bytes vs text (incidentally, mostly native english speakers.)


Agreed, Python 3 is in almost all aspects better and I really prefer 3 over 2. Most of the bigger 3rd-party libraries are already ported, at least all the ones I care about.

Numpy seems to be the biggest issue for most, but I guess it must be in some special area, I never found a need for it doing web development.

The unicode support removes so many of our problems, although I can understand the strings-vs.-bytes issues that some people run into.


Numpy works really nicely on python 3.

Numpy was ported to python3 years ago. It was one of the first libraries whose developers rolled up their sleeves and did the work for the good of the python community. I even helped with this, so I know first hand.


Oh, you're right... I wonder what I was thinking of then... SciPy maybe, but that's been ported as well.


So what is this strings vs bytes thing all about?


Historically, when the computing world only knew about English, a character was represented by a byte. That led to a string being essentially a byte array. As such, the concepts of a byte and a char, and of a buffer and a string, were often used interchangeably (in fact, the 8-bit type in C is called char). With Unicode and the need to parse international text, however, a character can no longer be represented as a simple byte, and things have become messy.

Python3 finally cleanly separates the concepts of a "byte" (8 bits) and a "character" (which now is an abstraction). A String is now a collection of characters and no longer functionally equivalent to an array of bytes.

This change is normally welcomed by people who have to deal with multiple languages and encodings anyway, as making the conceptual difference between a character and a byte explicit makes dealing with text much easier. However, if you were used to thinking of strings as byte arrays and not as "the data type for text", you might have a hard time.
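A minimal illustration of the split described above, using standard Python 3 behavior:

```python
# In Python 3, str is a sequence of abstract characters,
# while bytes is a sequence of raw 8-bit values.
text = "naïve"               # str: 5 characters
data = text.encode("utf-8")  # bytes: the ï occupies two bytes in UTF-8

assert len(text) == 5
assert len(data) == 6
assert isinstance(text, str) and isinstance(data, bytes)

# Round-tripping requires naming an encoding explicitly.
assert data.decode("utf-8") == text

# Mixing the two types is now a TypeError rather than silent mojibake.
try:
    text + data
except TypeError:
    pass
```

In Python 2, by contrast, `"naïve"` would have been a byte string whose length depends on the source file's encoding, and concatenating it with a unicode object would have triggered an implicit (and often wrong) ASCII decode.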


I think Armin Ronacher has the best overview: http://lucumr.pocoo.org/2014/1/5/unicode-in-2-and-3/

But it's basically Python 3 killing off the byte-string and ignoring the fact that in some cases it actually makes sense to work with strings that are specifically not unicode, rather than resorting to the bytes type instead.


I don't see how it is coherent to ask for a string that is not unicode, and not bytes. What is it?


A string encoded in some non-unicode encoding.

Not everything in the world is unicode (or unicode compatible the way, say, US-ASCII is.)


I've never really understood these claims that Python 3 only offers minor improvements over Python 2. Some of the improvements in Python 3 are quite significant: the example that comes to my mind first and foremost is that strings are now Unicode by default. These days you should really be using UTF-8 as your default encoding right across the board (the only exception being when you have to interact with legacy systems), and any language that doesn't make it so is quite frankly a deal-breaker as far as I'm concerned.


Agree. I did a large project last year on python 2 and wish I could go back in time and start with python 3 right away. To me unicode support really was the crappiest side of python, and it's good to know that it's properly supported now.

The problem for me with python is more general than just python 2 vs 3. It's that python seems stuck on the purely dynamic-typing side of PL, whereas the trend now (golang, haskell, dart, typescript) seems to be that types are useful for many things.

But sticking with python 2 wouldn't change anything to that matter.


Python 2 has unicode literals and utf-8 available if you declare them at the top of your file. You don't need Python 3 for that.
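Concretely, the declarations referred to here look like this (a sketch; the coding comment is only needed when the source file itself contains non-ASCII text):

```python
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

# With the __future__ import, bare string literals are unicode objects
# even on Python 2; without it, each one would need a u'' prefix.
greeting = "grüß dich"
assert greeting == u"grüß dich"
```

The catch, as the replies note, is that both lines must be repeated at the top of every file, which is exactly the kind of ceremony Python 3 removes.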


Yes, but it's not the default and it does require special considerations. This means that code written by inexperienced developers (or even more experienced developers who aren't familiar with the ins and outs of Unicode and internationalisation etc) will be more likely to behave incorrectly when presented with anything in a foreign language.


If they have difficulty grasping those concepts, then they shouldn't be working with them. They should rather stick to plain ascii/latin-1 and leave the fancy things to people that know what they're doing. Python3 in this case wouldn't be a silver bullet to make them magically work with unicode better.


Nobody should be sticking to plain ascii/latin-1, and certainly not inexperienced developers. The only valid reason for using anything other than UTF-8 these days is that you are interacting with a legacy system that doesn't support it.

Unicode isn't a "fancy thing" that is best left to "people that know what they're doing." It's the difference between displaying a name such as Siån correctly and as a bunch of hieroglyphics.

Besides, even for experienced developers who know what they're doing, if you're imposing any form of ceremony or special considerations around using Unicode, you're increasing the risk of bugs and mistakes.

That's why I say Unicode should be made the default. It's legacy encodings such as Latin-1 that should be treated on a need-to-know basis.
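One concrete place where defaults still bite: even in Python 3, text-mode open() falls back to the locale's preferred encoding, so code that wants UTF-8 everywhere has to say so explicitly (a sketch; the temp-file name is made up):

```python
import os
import tempfile

name = "Siån"  # the example name from above

# Passing encoding= explicitly avoids mojibake on platforms whose
# locale default isn't UTF-8 (e.g. legacy Windows code pages).
path = os.path.join(tempfile.mkdtemp(), "name.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write(name)
with open(path, encoding="utf-8") as f:
    assert f.read() == name
```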


This is precisely what I would consider a minor improvement.


No, defaults are important. It's the difference between "doing the right thing by default" and "doing the wrong thing by default."

Unicode may be of little relevance to some disciplines, such as scientific computing for example, but in others, such as modern web development, it is a deal breaker.


I don't need to be told that proper handling of Unicode is important. I know it's important.

I also don't need to be told that defaults are important. I know they're important.

But it is still a minor improvement under the scope of a ~6 year migration that has been absolute hell to everyone involved. Under any other circumstance, I'd be right there with you saying It's A Really Good Thing, but if we have to pay this high of a cost for it, then the improvement here looks pretty meh to me.

If you want to insist on evaluating this on some absolute scale irrespective of its cost, then I don't want anything to do with that.

Finally, one might argue that whether this is actually minor or not is irrelevant. What's relevant is that a boatload of people perceive the delta between Python 2 and Python 3 to be incredibly small, and yet, the amount of work to migrate is dauntingly large. There's a discrepancy there regardless of whether you disagree with others' valuations of the improvements in Python 3.


I think the problem of Python 3 is that it is too close to Python 2. Learning Python for the first time is really fun, compared to other languages. There is almost always an interesting technique that does exactly what you need, and usually great tutorials. But switching from Python 2 to Python 3 is quite frustrating: it is just different enough that one cannot rely on intuition, but similar enough to lure one into using intuition, with the added frustration that all the errors would just not be there in Python 2. So assuming that I will not get paid to become a Python 3 expert, I will just write Python 2 and pick up another language at some time in the future.


Mostly, what you need to become a "Python 3 expert" is learning the Python 2 __future__ module. Use it for a while, then the switch will be seamless, intuition-wise.
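The imports in question are roughly these; on Python 2 they switch each file over to Python 3 semantics, and on Python 3 itself they are harmless no-ops, which is what makes the eventual switch seamless (a sketch):

```python
from __future__ import absolute_import, division, print_function, unicode_literals

# division: "/" is true division and "//" is floor division, as in Python 3.
assert 3 / 2 == 1.5
assert 3 // 2 == 1

# print_function: print is a function, so it accepts keyword arguments.
print("python", 2, "and", 3, sep="-")

# unicode_literals: bare literals are text (unicode), not bytes.
assert isinstance("text", type(u""))
```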


Marketing told you it is hard? Or did you try it?


In my experience, the lack of support by common libraries is not nearly as bad as the author claims:

http://python3wos.appspot.com/


All my libraries run on Python 3, but I would recommend that nobody use them on that Python version if they can also use them on 2.x. They have better unicode support on 2.x, and there are more 2.x users and thus fewer bugs in 2.x-specific code. Aside from that, the ecosystem of other libraries is bigger.


The point you raise regarding the impact userbase-size has on the library is very important in the Python2/3 split. It's similar to the divide that multi-architecture releases have to deal with, where less common platforms are more likely to have long-lived bugs just because the userbase is smaller and less likely to find and identify them.

If, as the author of an impressive collection of Python libraries, you had to make a rough estimate of when your Python 3 userbase will reach critical mass, such that they have a reasonable shot at finding those bugs and getting them fixed, how far into the future do you think we're looking?


Last time I looked, which was two months ago, less than 5% of my users used Python 3 (up from 3% two years ago or so), at least going by download stats. But Python 3 support takes way more than 50% of my time :-/


Yes, but that's you, and you are a known anti-P3 (at least as far as string handling goes) type :-)

What do other devs that ported their stuff to P3 say?


Re: "There are other solutions, but reviving Python 2 is so obviously the correct thing to do, that other solutions are not worth mentioning."

I don't even know how to respond to such "logic".


I completely agree. Since the release of Python 3, I've attempted to get familiar with Python and I always have this dilemma.

Which version do I learn first?

On a first, ignorant glance, learning Python 3 makes sense because it's the way of the future. Although, the more you learn, you slowly figure out that they're not that different; but their communities and supporting libraries are in two separate camps. It's like there's a cold-war mentality between people who can't or won't port/migrate to 3 and the Python 3 crowd that looks upon the former with disgust and disdain. There's clearly a huge divide, and it's uncomfortable and confusing for new folks to learn and get involved.

Now we're about 6 years since the first Python 3 release and I've never gotten involved with Python beyond a few days of tinkering. I've moved on, and based on my experience, people who are newcomers to programming are walking right by without a second thought. This, coupled with the developers who are jumping ship, isn't good news for Python. They need a course change, and they need to make it quickly.

Just my $0.02.


Forking Python 2 would be a terrible idea. Especially since 2.7 keeps on being supported anyway.

Python 3 is the way to go in the long run.


Even without python 3, it has become an increasingly complicated ecosystem. I run python 2.x on Windows for scientific uses and find incompatibility to be a common problem. I need separate 32-bit and 64-bit versions which use different DLLs that may or may not be in several possible locations and use incompatible libraries. Trying to run pypy or python 3 would be just too complicated.

Python is not a single ecosystem any more. We need separate distributions that can be self contained.


I've been told the Anaconda distribution (https://store.continuum.io/cshop/anaconda/) of Python can ease some of the pain there. The fact that I have to direct you to a "store" to download it does give me pause, however, even if the distribution is actually free.


Anaconda is open source, free and will always be. The founders of our company (and the rest of us) have a long history of being good open source citizens. There are enhanced versions of Anaconda that include proprietary code that we charge for. The base distribution (which includes Pandas, SciPy, NumPy...) has and will always be free.

Note:I work for Continuum Analytics.


The link to the "non-store" download page which we tell everyone about is: http://continuum.io/downloads.

Yes, we are a company and do sell things, but our commitment to open source is backed by many, many years of significant contributions to the community. We give as much away as we can, and sell to those who need more.

While you can certainly buy more from us, Anaconda itself will always be free as long as we can continue to make it.


Thank you for that. And thank you for your contributions. My point about the store was more aesthetic. If you are ideological about the openness of your tool chain (and I start edging that way the closer things get to science), then the "exit through the gift shop" is mildly off-putting. I say mildly, because it still doesn't prevent me from recommending it over other alternatives. I also understand that your paid products add value (and in the case of MKL, aren't yours to open-sourcify anyway).

The non-store link definitely feels more open-sourcey. I'll be sure to send people that way. Again, I didn't want to imply that you all were bad open source stewards.


Isn't this what virtualenvs are for?


The tradition on Windows is to download binaries and also not to maintain binary compatibility across compiler versions, so it can be a pain to get matching compiler versions of libraries.


The part about 32-bit versus 64-bit is a red herring; you'll deal with that on Windows regardless of what runtime environment you use on top.


This seems a bit overdramatic. Yes, the attempt at a version-number hijack by an influential set of developers is annoying, but there is no reason it can't just be ignored. Python 2.7 doesn't need new features. It is still wildly useful without them. It is perhaps even more useful without them...

You can't deprecate a computer language. Such a thing exists independently of implementations.


I feel the pain :(

The problem is that there is a misconception that python libraries need to be "maintained" and that if they are not maintained they should not be used. However, many long-tail libraries really do not need to be "maintained": the code is well tested, works, and has no security implications. Some of these libraries are just clever algorithms or re-engineering of some obscure protocol or format.

I guess this is one more reason that java is still the safest choice :( Sad.


Decent read - it further helped me appreciate the position that Python 2 devs who require certain non-trivial, non-ported libs are coming from.

That said, as a mostly Python 3 developer, PEP 404, to which he links, provides much insight and justification for ending at 2.7. In brief, the changes all seem to me to be efforts to make Python more 'Pythonic' (i.e. TOOWTDI).

PEP 404 2 -> 3 reasoning: http://legacy.python.org/dev/peps/pep-0404/#and-now-for-some...

TOOWTDI: https://wiki.python.org/moin/TOOWTDI


I'm curious what the lesson here is. A lot of the changes in Python 3, such as the byte format of strings, seem to be a good thing--just something that breaks compatibility.

Is the lesson never to bother fixing a major flaw in your software/language after it has been widely used/available?


There are two precedents to look at: Java and Perl.

Perl grew from merely useful to your-favorite-whiz-bang by being a decent language and growing a large collection of third-party modules. Then Larry Wall decided Perl5 wasn't fun anymore (also because of wart accumulation) and declared Perl6 to be The Future(tm). Years later, we have some remaining stuff that's running in Perl5, and also some enthusiasts that use Rakudo and Parrot in an attempt to reach the Shangri-La of Perl 6. But, in many important respects, Perl is an ex-parrot and Parrot won't change that all that much.

Now for Java. Java started out as a slow and ugly cousin of C++ that ran a lot slower and gave you the worst of both a bad compiler and a stupid bytecode interpreter. As Java gained traction, Sun and others (e.g. IBM, Apache) proceeded to fix all the warts in a backward-compatible way. Java 1.2 finally had decent collections. Java 1.5 adopted Philip Wadler's proposal for generics through type erasure. Newer Java releases added some sorely missed utility functions that you'd take for granted in other languages, such as converting between an array and an ArrayList, or splitting a string.

Somewhere in-between, we got servlets, JSP, JDBC, Hibernate, JAXB, a fast compiler. It's not so bad, even though the language still bugs me sometimes.

What's in store for Python? Similarly to Perl, and differently from Java, Python2 has byte-strings, and just when people were coming to grips with there being some things for which str is a good fit and others where you want to use unicode, you got "unicode as default" in Python3. A conservative approach would be to extend the language and libraries so that it becomes natural to use unicode objects in the places where you're dealing with text, and to keep the functionality that bytestrings have, so that old code still works. Python3 did not choose the conservative approach - indeed, the goal was to do all the compatibility-breaking things that were needed, upfront.

Despite that fact, there is hope that Python will not undergo the Parrot-to-ex-Parrot transition that Perl underwent. The Python core is already there, and many important libraries - including the numpy/Cython etc. toolsuite, and different web frameworks, are actively trying to find the remaining issues.

So, as a summary: A lot of the changes in Python 3, such as renaming the old strings into bytestrings and removing a lot of their functionality, are well-intentioned but have considerable problems. It definitely gets harder to fix major flaws in a language that is widely used, and blackmailing the users by declaring the "not as fun" old version of your language dead should not be done lightly if you don't want to become the next Perl.


> Perl is an ex-parrot and Parrot won't change that all that much.

It's worse than that. Rakudo developers declared Parrot deprecated (in favor of a new VM which had yet to be written). That, of course, drove off all of the Parrot developers--see the decline in commits in March 2011:

https://www.ohloh.net/p/parrot/commits/summary

If Rakudo ever reaches a point of stable equilibrium where it's reasonable to think about creating libraries for it, P6 still faces an adoption cycle that's more difficult than that of Python 3. There's no plan for P6 to replace Perl (partly to avoid that adoption pain, for one, and partly because P6 has taken so long, there wouldn't be any Perl users remaining unless P6 had publicly changed its strategy).

Python's transition hasn't been quick or easy, but it's gone much, much better than the P6 fiasco.


I won't comment on Parrot, because I know nothing of Parrot or Perl 6 internals. But from what I hear and read on the internet, the new VM, MoarVM, is doing pretty well as a Parrot replacement.

> P6 still faces an adoption cycle that's more difficult than that of Python 3.

I don't see how, especially when p5 developers are saying they will continue to support p5 long after p6 is released. And p6 is such a far drift from p5 that you can almost call it a new language. And no one in their right frame of mind will rewrite all p5 software in p6.

So it will be p6 for any new projects and p5 supported all along.

> There's no plan for P6 to replace Perl (partly to avoid that adoption pain, for one, and partly because P6 has taken so long, there wouldn't be any Perl users remaining unless P6 had publicly changed its strategy).

From what I read (again, I have no personal contributions or sources and rely on what I read or hear on the internet) in http://jnthn.net/papers/2014-fosdem-perl6-today.pdf [pdf], rakudo is very close to spec completeness, and the bulk of the effort currently is implementing the last bits of the spec and improving the speed.

> Python's transition hasn't been quick or easy, but it's gone much, much better than the P6 fiasco.

Python is hardly comparable with Perl 6. They just changed a print statement and iterators in Python 3. Perl 6 is a completely different story.


TL;DR Imo, for at least the next year, the P6 project needs to encourage contribution by folk who enjoy creating an entire system and discourage use by folk who just want to get stuff done.

----

I've loosely followed P6 since 2000; daily read the log of the central hub of P6 activity -- the IRC channel #perl6 -- for the last 3 years; and summarized it daily for the last 2 years.

I am concerned about folk getting the wrong impression about Rakudo's readiness for general use by mainstream programmers that just want to get stuff done.

I started using the following disclaimer a year ago and I think it still sets expectations appropriately:

Perl 6 is not remotely as usable and useful as Perl 5; it has dozens of users, not millions; it is 100-1000x slower than Perl 5 for a lot of stuff; the P6 documentation is immature and incomplete; the spec has not reached 6.0.0; the Rakudo compiler has not fully implemented what's already in the spec; most of the concurrency and parallel implementation has only just begun; P6 can not currently use CPAN modules; Perl 6 has syntax and semantics that are not backwards compatible with Perl 5; Perl 6 culture is -Ofun which some think is incompatible with getting things done; some folk think that Perl 6 code looks like line noise... In summary, there are infinitely many things wrong with P6.

----

> From what I read ... rakudo is very close to spec completeness ... implementing last bits of spec and improving the speed

Ignore implementation (Rakudo) for a moment.

The spec isn't complete.

There's very recent talk of producing "a list of big ticket items remaining" for "hitting 6.0" and "a tri-color marking of the spec" showing "Implemented in Rakudo, not implemented but we agree we really should for 6.0, deferred to 6.*".


I think only someone far detached from reality would expect a Perl 5-like ecosystem to be ready on the day of the Perl 6.0.0 release. And most people don't need many of Perl 6's esoteric features either. So I guess your "a list of big ticket items remaining" hopefully won't be "a big list of ticket items remaining".

Besides, I was of the opinion that even after the 6.0.0 release the spec won't be complete. The spec will never be complete. The spec is a backwards-compatible, evolving document which the implementation tries to catch up with. The stable releases of the implementation are production releases.

Lastly, you could just say: esoteric features a, b, c, d are left for the next 6.* releases, while a 6.0.0 release is cut with whatever you decide from that list earlier. The world is in no hurry to use the esoteric features anyway.

Either way, I wish you guys all the very best, and hope to see closure on this soon.


> I think only some one far detached from reality would expect, a Perl 5 like ecosystem ready on the day of Perl 6.0.0 release.

I think it's still plausible that someone (it'll have to be one or more serious P5 guts hackers) will soon start implementing something like diakopter's plan for P5+XS interop. (That's the plan for having P6 code calling P5 code and back again, including XS modules on CPAN.) If that works out well, a 6.0 release could be viewed as including all of the P5 ecosystem or, conversely, as an experimental part of P5.

But perhaps I'm far detached from reality.

I certainly wouldn't expect the project to wait for such interop. If 6.0 has to ship without support for P5+XS modules, so be it.

> So I guess your "a list of big ticket items remaining" hopefully won't be "a big list of ticket items remaining".

> Besides I was of an opinion even after the 6.0.0 release the spec won't be complete.

It sounds like the plan is to mark particular bits (paragraphs?) as either 6.0 or post 6.0 and associated spectests will determine whether or not Rakudo is compliant.

For more of the discussion see http://irclog.perlgeek.de/perl6/2014-05-23#i_8767026

> Esoteric features a, b, c, d are left for next 6.* releases.

I don't know about the esoteric aspect being the decider, but yes, I'm assuming the basic idea will be to defer as much as is reasonable.

> Either way, I wish you guys all the very best. And hope to soon see a closure soon on this.

    { "this" }
Maybe not what you're after?

Seriously, while I hope for a 6.0 spec this year I think there's about zero chance of a 6.0 Rakudo this year. And unless a few dozen more contributors turn up maybe not in 2015 either.


Thanks, for the link.

But you are talking of another 2+ years. That's a lot of time in software.


Sure.

I wish it could be quicker and I hope it will.

In the meantime I contribute as best I can.


chromatic is hardly a simple bystander in this issue. He's been very public in his feelings towards P6 and how it's dealt with Parrot, from back when he was very involved with Parrot. I assume he thinks he's giving valid advice as to the whole P6/Parrot thing, but the visible vitriol leads me to believe there's definitely some emotional underpinnings as to his opinions.

Which is a shame; he's one of the most insightful people I've ever followed (I miss his more frequent blogging at modernperlbooks.com, which I've followed since its inception).


Unless the Singularity has occurred and no one told me, everyone has emotions. I prefer, however, to discuss facts backed up by primary sources, such as the commit stats linked earlier or mailing lists or IRC logs. Do you have an opinion about the situation based on those?


Of course. I'm not making some pie-in-the-sky assertion that arguments should be devoid of emotion, just that I had noticed what appeared enough emotion to overrule civil discourse in the past.

That said, after re-reading your original comment, I don't see much to disagree on (except for one thing, which I'll cover later), and it doesn't seem to exhibit many of the problems I was alluding to, so you have my apologies. It was inappropriate of me to bring that up in this context.

What I should have commented on was that Parrot was all but dead long before MoarVM put the last nail in the coffin. I doubt the announcement did much beyond cement what most developers already knew - Parrot was not well suited for its original purpose, whether you view that as a VM for P6 or a general purpose VM, specifically because for too long it had tried to follow both paths to the detriment of all.

My impression of Parrot, as an outsider who consumed quite a bit of the public information available about parrot development, ended up being that Parrot was too experimental of a VM, and never moved beyond the stage of being a fun sandbox for developers to come experiment in. When P6 was still coalescing around how to implement what was in the spec (and revising the spec in light of that), this didn't matter much. As P6 increasingly moved towards an actual implementation, Parrot's misalignment with what P6 needed along with Parrot's deprecation policy caused many problems between the projects.

I think at one point, prior to MoarVM being announced, there was still a chance Parrot could have been "saved". But by saved, I mean continue on as a sandbox for developers to experiment in, possibly without ever culminating in a VM that was useful as a primary target for a language. If enough developers had been willing to continue with this truth exposed (and I'm not sure enough would have), then if Parrot had made the choice to continue as a "general purpose VM" instead of primarily a P6 target, it may have survived as a sandbox.

I'm interested in your opinion on this interpretation - not because I think any of it is news to you, I'm sure it's not - but the opposite, in fact. This view is in no small part informed by your own writings on the subject over the years.

P.S. I was completely serious before when I said I missed your more frequent posting on modernperlbooks.com. I hope that no matter what the future holds for you, that it also includes you writing about technical issues in blog or book form (and enjoying doing so!). I know you've helped solidify my thinking on a great many things over the years, to my great benefit.


I think at one point, prior to MoarVM being announced, there was still a chance Parrot could have been "saved".

The deprecation policy was a point of contention, sure. It was poorly implemented in practice--but it was as much to help Rakudo as it was to reduce the amount of churn in Parrot. Rakudo probably would have had less trouble with it if Rakudo had less code which poked into the internals of Parrot. That's half the fault of Parrot, which had poor design of certain pieces, as well as Rakudo, which rejected a lot of offers to pull some or most of that code into Parrot. (That's a theme.)

I personally spent most of my time on Parrot doing things to improve it for Rakudo--fixing a lot of unpleasant bugs in both Parrot and Rakudo, for example, as well as improving performance. I and other Parrot developers asked multiple times for a list of issues Rakudo wanted Parrot to address. Occasionally we'd get a list and work on them--but I also volunteered to focus on specific improvements for Rakudo and was told "No" or "Not yet" multiple times. You can see that too, if you look in the IRC logs--and not just for things I wanted to do. Andrew Whitworth, in particular, spent months or years on tenterhooks, volunteering to implement the object system called "sixmodel" in Parrot, but he was always told that it wasn't fully designed yet or it wasn't mature enough. Eventually he left the project too.

On one side, I took a lot of criticism from Rakudo developers for not pushing Parrot harder to be what they wanted and on the other side, I took a lot of criticism from Rakudo developers for trying to change Parrot to be what they wanted. In the face of that seeming contradiction, it seemed obvious to me that they'd already decided to write and maintain a new backend VM from scratch.

My priority--getting a usable P6 release out and stable for general purposes--was clearly incompatible with that decision. Rakudo Star was supposed to come out as a "useful and usable" distribution for early adopters over four years ago, and to my knowledge, it's still only nominally useful and usable for very few purposes.

That is one of my most substantive criticisms of the whole process. Rakudo and P6 have a long history of claiming to be about eighteen months away from general usability, but the project's history is littered with rewrites, questionable technical decisions, and many, many good people leaving in frustration--sometimes quiet and sometimes not.

I begrudge no one working on it, but I believe it's on a path to ever less relevance without a dramatic change in project management. I'll reluctantly take the blame for mistakes I made as a member of the P6 design team and as a leader in Parrot, but I object to rewriting history to suggest that Parrot was solely at fault for Rakudo's current state.


I understand you were in what seems an untenable position between Rakudo and Parrot. I'm not trying to further the "Parrot caused all Rakudo's problems" argument, just my own, which is that Parrot and Rakudo were destined to either split further or merge at some point. It's unfortunate that Parrot started towards what appeared to be a merge at just about the same time Rakudo decided a deeper split was the only way forward.

Personally, I've always viewed MoarVM as a spiritual successor to Parrot. Parrot and Rakudo begat NQP, and NQP has survived Parrot. It's a shame more Parrot devs didn't jump to MoarVM, but they of course had their own motivations for working on Parrot in the first place, motivations which may not align well with being "the Perl 6 VM" (among any number of other reasons) - a role MoarVM exemplifies more than Parrot had in many years.


It's a shame more parrot devs didn't jump to MoarVM

Why would they? Rakudo's position was "Parrot is fundamentally broken and Rakudo is actively moving away from it." You can see how Parrot's development all but stopped within a few weeks of that discussion from the Ohloh link I posted earlier. More than that, Moar's development began in secret around that time, so how would Parrot developers know that they should have been working on that instead?

(Rakudo developers often claim that Parrot had no singular focus on Rakudo, but that graph argues differently to me. Then again, Rakudo developers also claim that they decided against using Parrot because Parrot development ceased, but that's an obvious post hoc ergo propter hoc argument to anyone who looks at the chronology.)

Personally, I've always viewed MoarVM as a spiritual successor to Parrot.

I looked at the code briefly after its announcement. It seems to repeat most of the long-standing architectural flaws of Parrot and it ignored much of the design work that we had been doing to make a Parrot which could compete favorably with fast VMs such as LuaJIT or v8. (My impression then was that, if Rakudo hadn't chased away Parrot developers and Parrot were free to ignore pesky things like backwards compatibility, deprecation, and users who wanted it to continue to work, Moar looked a lot like Parrot would have after the same time period. It was pretty disappointing.)


> Moar's development began in secret around that time

MoarVM's first commit was in April 2012, a year after Parrot commits fell off a cliff.

https://github.com/MoarVM/MoarVM/commit/51481efadbf5bbebfc20...

> Rakudo developers also claim that they decided against using Parrot because Parrot development ceased, but that's an obvious post hoc ergo propter hoc argument to anyone who looks at the chronology.

Maybe look at the chronology again?

> {MoarVM} seems to repeat most of the long-standing architectural flaws of Parrot

Such as?

> (Moar looked a lot like Parrot would have after the same time period. It was pretty disappointing.)

As you now hopefully realize, the time period was 14 months, not 29.


MoarVM's first commit

You mean the first commit committed to a repository which survived long enough to be made public, eventually. Even so, if you look at the timestamps on those first commits, you realize that either the design of the VM leaped from someone's head fully-formed like a virtual Athena, or (as was widely known at the time) that someone had been designing and playing with ideas for much longer.

Maybe look at the chronology again?

The one where Rakudo developers told Parrot hackers not to implement sixmodel to replace Parrot's default object system (the one which one of those Rakudo developers had, in fact, actually implemented) during the period where it was obvious that those Rakudo developers were in fact designing their own VM?

Who am I to believe, you or my own lying eyes?


The one where Rakudo developers told Parrot hackers not to implement sixmodel to replace Parrot's default object system (the one which one of those Rakudo developers had, in fact, actually implemented) during the period where it was obvious that those Rakudo developers were in fact designing their own VM?

Hmm, I read that as Rakudo devs trying to keep Parrot devs from starting a project that they might not want to continue in the future (given your statement that it was obvious they were looking into developing their own VM). If the decision had been made, trying to reduce fallout seems wise to me, given also that they weren't public about the new direction, the details of which I don't know.


I should have been more clear. The message wasn't "No, not ever." It was consistently "No, not yet."


I'm not sure that changes the equation from where I'm standing. When you want to keep someone from wasting time but can't/won't explain why, using "hold off" until you can finally explain seems the logical choice, since "don't do that" gives away too much.

Again, I'm not questioning the premise that the information needed to be kept hidden; I don't know the reasoning for that. It just seems odd to me that this gets singled out as a negative when I see it as trying to make the best of a bad situation.

I'm only following up on this because it seems an apt example of the emotionally driven arguments I was referencing before. I see at least several possibilities. 1) There is information I don't know or am not inferring correctly, 2) we have some mismatch in values that is causing us to interpret the same situation differently, 3) your emotional context causes you to interpret the situation differently (non-rationally), or finally, I have to admit it's possible 4) my emotional context is causing me to interpret the situation non-rationally.

The outside appearance to me, as stated previously, is #3. That of course in no way makes it the most likely answer. (Sorry if you find this boring, I find it interesting as it mixes two interests of mine).


I'll try to be as clear as possible, at the risk of sounding like a humorless pedant.

When Rakudo announced it wanted to rewrite NQP to run on multiple backends, the stated justification for not relying on Parrot in the long term was twofold. First, because Parrot developers didn't treat Rakudo as the most important hosted language. Second, because Parrot didn't provide the features Rakudo wanted--in particular, its object model was unusable by Rakudo.

Those reasons are tied together; the second was used as proof of the former. You can also throw in the deprecation policy as a supporting reason, but that makes things more complicated because Rakudo developers wanted it in place for some things ("you can't change things in Parrot and break our code") and wanted it gone for other things ("why can't you just fix this thing?").

I have a problem with both reasons #1 and #2, because I have multiple examples of Parrot developers offering to make changes to help Rakudo. In my case, I was told not to do them. In the case of sixmodel, the stated impetus behind #2, Andrew and others (who had been accused of not wanting to help Rakudo) were continually volunteering to write the very code that Rakudo developers said they wanted and were continually told "No, not yet." (See Raiph's links, for a few of many examples. See the #parrot and #parrotsketch logs for many more.)

Meanwhile, Rakudo developers were continually complaining how Parrot developers were not interested in helping Rakudo and how Parrot was technically a bad fit, citing sixmodel as an example.

I interpret that response as something other than good faith. I believe that's why Parrot developers left; there's no point to sticking around in that situation.


Thanks, that does explain a lot of your reasoning more clearly. I'm not sure I can change my assessment of the situation though, as the relative timings of these events, which I think matter greatly, aren't known to me, and the Rakudo interpretation of these circumstances isn't known to me either. I believe the Rakudo response can be explained by their belief that the special relationship between the projects was unsalvageable, and that the way forward was no longer in Parrot. In that case I try to fall back on my default, which is to assume people will act in their best interests while also trying to minimize damage and discomfort to others unless given cause.

Whether the actions of the Rakudo developers indeed did cause a major Parrot exodus is something I'll easily cede at this point. Whether that was intended, unintended or the opposite of what was intended seems to be what we are really talking about here (because I believe intention can matter, not in assignment of blame, but as possible mitigation of scale).

The irony of this is that Parrot seems to now well and truly be a P6 focused VM, as the only work done on it (besides rurban and util) is by Rakudo devs fixing bugs or making small changes so new Rakudo stuff works. Why rurban still bothers is beyond me, AIUI he's still got p2 to work on.

In any case, you've been more than a good sport in humoring me in all this; you put up with more from me than I would have imagined. Thanks!


Whether that was intended, unintended or the opposite of what was intended seems to be what we are really talking about here.

Let's assume the Rakudo developers acted in good faith per your reasonable default. The effect is still that Parrot is all but abandoned, multiple productive developers with decades of practical experience attempting to implement P6 no longer contribute to either Parrot or Rakudo, and Rakudo isn't obviously more usable than it was three years ago. The effective delivery date of P6 is still "some time in the future". The fourth anniversary of Rakudo Star is approaching and it hasn't met its goals.

Even if Rakudo were released in a stable, 6.0 form today, I still wouldn't use it for anything I care about because I don't trust its developers to deliver usable software reliably.


Rakudo isn't obviously more usable than it was three years ago

I don't believe that to be true. Even if by usable you mean primarily can be used in production, I would still argue it's more usable than it was, even if I wouldn't label it strictly "usable." I wouldn't shell out to bc for math ops, for any number of reasons (including it being slower for my purposes), but I wouldn't call the bc utility unusable.

I don't trust its developers to deliver usable software reliably

I can't fault you for that conclusion, given your experiences. That said, developers involved in a project change over time, as does the influence of existing developers (and maybe you believe people can change, too). That conclusion may become more or less true over time, and indeed may have already changed. Perhaps some day you'll find a reason to be interested again, and find the experience more enjoyable. We can only hope. :)


TL;DR Imo specific evidence directly contradicts chromatic's assertions about the intentions, plans and actions of the Rakudo dev leads (Patrick and jnthn). My assessment, and chromatic's too I think, is there was a growing breakdown in trust that blew up in 2011 with serious consequences for both Rakudo and Parrot.

> When Rakudo announced it wanted to rewrite NQP to run on multiple backends

What announcement are you speaking of? In the past you seem to have suggested they announced this in late 2010 or early 2011. Available public evidence, including a github repo, makes it clear that one of the primary objectives of the 2009 NQP rewrite was to transition to a multiple backend architecture, and there was no attempt to hide this.[1]

There were discussions in early 2011 about the next stage in the multi backend evolution, and about elements of Rakudo's latest NQP becoming a central part of Parrot, and this did seem to get derailed by misunderstandings about NQP's support for multiple backends. Maybe that's what you're talking about?

> First, because Parrot developers didn't treat Rakudo as the most important hosted language.

Patrick's stated justification for a multiple backend Rakudo is explicitly grounded in Larry's 2001 specification that Perl 6 must run on existing major VMs[2] and Patrick's decision when he became the Perl 6 project manager in June 2009 to prepare Rakudo to run on JVM and CLR.[1]

That said, by the time Rakudo moved out of the Parrot repo in early 2009 there were definitely major problems with the way Parrot and Rakudo dev meshed (or not)[3] so it might have been good if Parrot had treated Rakudo as an important hosted language, or at least had better responded to what Rakudo actually asked for, according to the mailing lists and IRC logs from 2009 thru 2011, namely for there to be better management of breakage.[4]

> Second, because Parrot didn't provide the features Rakudo wanted

Again, aiui, this wasn't a stated justification for Rakudo considering multiple backends before 2009 nor was it when Patrick rewrote NQP with support for multiple backends in 2009.

But, yes, the growing gap between what Rakudo was saying it needed from Parrot and what Parrot actually provided must surely have led Patrick and jnthn to ponder a plan B, and spending a few years writing another VM would have been a natural thing to consider in the midst of the deteriorating Parrot situation in 2011 despite the very scary prospect of yet another huge delay in getting P6 to 6.0.

> I interpret {Rakudo's} response as something other than good faith. I believe that's why Parrot developers left; there's no point to sticking around in that situation.

There were indeed clear signs of a severe breakdown in trust littered throughout 2011.

I think jnthn and especially Patrick really tried to solve this problem but were fighting a losing battle. I acknowledge my bias (I like Patrick and jnthn) but there are tons of examples such as Patrick drafting a formal Parrot/Rakudo relationship policy in mid 2011:

http://lists.parrot.org/pipermail/parrot-dev/2011-July/00598...

(Note that there was no reply.)

I agree there would be no point in folk continuing to cooperate if there's an assumption of bad faith. Typically if person/group A interprets person/group B's actions as being done in bad faith, person/group A themselves starts acting in bad faith, or at least in a non-cooperative manner. If this attitude spreads throughout a group then all is indeed lost.

----

[1] The two Rakudo devs who matter in this context are Patrick and jnthn.

Patrick Michaud's 2013 "Perl on JVM" talk made it clear that one of the goals of his 2009 NQP rewrite was to restructure it to support multiple backends corresponding to existing major VMs (with JVM and CLR specifically in mind).

https://www.youtube.com/watch?v=XgPh5Li3k4g

Googling for rakudo multiple backends with a date filter and then doing a quick archive.org dig shows that sometime in 2009 jnthn changed the "What do I do in Perl 6?" section of his about page to include "Transforming Rakudo from a single backend compiler to one capable of targetting multiple backends".

http://web.archive.org/web/20091218092004/http://jnthn.net/p...

[2] "Perl 6 must not be limited to running only on platforms that can be programmed in C. It must be able to run in other kinds of virtual machines, such as those supported by Java and C#." from http://www.perl.com/pub/2001/04/02/wall.html

[3] http://www.nntp.perl.org/group/perl.perl6.compiler/2009/01/m...

[4] http://comments.gmane.org/gmane.comp.compilers.parrot.devel/...

http://lists.parrot.org/pipermail/parrot-dev/2010-December/0...

https://groups.google.com/forum/#!topic/parrot-dev/nlgcpd2Xb...

http://irclog.perlgeek.de/parrotsketch/2011-09-06#i_4382083

http://lists.parrot.org/pipermail/parrot-dev/2011-September/...


My assessment, and chromatic's too I think

Again, please leave me out of your uninformed and biased speculations. You weren't there. You weren't involved. I was--not only as a developer on both projects but as the P6 project secretary. In that capacity, I took and published hundreds of pages of notes, and I'm more than capable of speaking for myself.


Imo you are being rude.

I noted what I thought was a parallel between my assessment ("breakdown of trust") and yours ("bad faith"). You may not see the connection and I may be wrong but my comment was for all readers, not just you, and was qualified with "I think".

My assessment was not uninformed. As you've said yourself, it's all there online and I've thoroughly researched the relevant period.

The key exception is that I have been unable to find your published P6 call summary notes. I would appreciate a link to them.


> if you look at the timestamps on those first commits

jnthn churns out great design off the cuff and generates rapid sequences of brilliant and clear commits. I've been following #perl6 for a couple of years so I've gotten used to it. (I wasn't surprised when I recently found out he has a first class honors degree in CS from Cambridge.)

That said, confirmation bias is an ever present danger so I still investigated a lot more than just the timestamps on the first commits before I posted my prior comment.

Here's the graph of contribution over time: https://github.com/MoarVM/MoarVM/graphs/contributors

Imo the pacing and content of the repo's early commits (all jnthn) are typical for jnthn.

> Rakudo developers told Parrot hackers not to implement sixmodel

The period you're talking about is early 2011 thru early 2012.

The only two Rakudo devs that are relevant in this context are jnthn and Patrick.

If jnthn was telling Parrot devs not to implement 6model throughout this period, why, on May 31st 2011, did he write several 6model docs in response to a request by lucian? Why didn't he just tell lucian not to implement 6model?

http://irclog.perlgeek.de/parrot/2011-05-30#i_3827733 http://irclog.perlgeek.de/parrot/2011-05-31#i_3833670

If Patrick was telling Parrot devs not to implement 6model, why, on July 6th, 2011, did he suggest a two month delay of doing so with whiteknight? Why didn't he just tell whiteknight not to implement 6model?

http://irclog.perlgeek.de/parrot/2011-07-06#i_4072930

In February 2012, two months before jnthn started the extant MoarVM repo, not_gerd wanted to know some stuff so he could implement 6model. Again, why didn't jnthn just tell him not to implement 6model?

http://irclog.perlgeek.de/perl6/2012-02-06#i_5109668

> during the period where it was obvious that those Rakudo developers were in fact {developing} their own VM

If it was obvious that Rakudo devs were developing their own VM in 2011, why did nobody mention it in 2011?


Why would they?

I wasn't trying to imply any sort of wrong decision on the part of Parrot developers for not working on MoarVM. I tried to make sure that was obvious. Perhaps I failed.

More than that, Moar's development began in secret around that time

I'm aware, which is one of the things I was alluding to when I mentioned "among other reasons" for not working on MoarVM. Beyond not knowing it existed, when they did there was probably a feeling of betrayal. I'm not trying to ignore that. The shame is that some of these people would most likely have enjoyed working on MoarVM if things had played out differently.

My impression then was that, if Rakudo hadn't chased away Parrot developers and Parrot were free to ignore pesky things like backwards compatibility, deprecation, and users who wanted it to continue to work, Moar looked a lot like Parrot would have after the same time period. It was pretty disappointing.

That actually sounds like praise to me (barring some of the architectural missteps you allude to earlier in your statement). That MoarVM was able to mostly get to where you think Parrot could have been if it had been able to ignore a lot of extremely hoary and cumbersome problems seems like a good thing. Did you find it disappointing because of MoarVM specifically, or because of the realized loss of potential from Parrot?

In the end, if the Parrot split solved some P6 roadblocks, and spurred more rapid P6 development, which I believe it did, I have to count the action as a wise choice. I found Parrot interesting in its goals, but its target of being a VM for scripting languages was starting to get competition from the JVM, which I'm not sure it could compete with. P6, on the other hand, still hasn't seen a competitor feature-wise IMHO to make me think its future is in doubt from an outside source (whether it will ever be successful, however you define it, is a different question). Of course your level of investment and interest in each of the two projects may cause you to weigh the situation entirely differently.


Did you find it disappointing because of MoarVM specifically, or because of the realized loss of potential from Parrot?

Several reasons, in no particular order.

One, because of the deliberate driving off of Parrot developers and their knowledge about what works and what doesn't work for building a VM for a Perl. (Sure, you can believe that I feel a little personal betrayal there, but there's also a concern that the Rakudo developers were taking on yet another project for which the bus number is, as usual, abysmally low.)

Two, because the resulting design (when I looked at it) was adequate, at best. I believe it won't compete with fast VMs without a serious change in its architecture, and no amount of magical thinking about how a GSoC student will build a JIT in 10 weeks will fix that.

Three, because it represents yet another example of NIH thinking and throwing out working code in favor of spending even more time building something new from scratch.

Four, because it set back the P6 release date by at least 18 months, if not years.

Five, because all it represented when I looked at it was the kind of code shuffling that Parrot could easily have done in the past three years, without doing anything much else (such as make the architecture improvements we'd begun to design and prototype).

One fact often forgotten in all of the nonsense about how "Parrot lost its focus and thus its purpose" is that Parrot was, from the start, intended to run at least both Perl and P6 simultaneously in the same process without embedding libperl.so.


> Larry Wall decided ... declared Perl6 to be The Future(tm).

Aiui it was a group of leading P5 devs that made the decision in 2000 to create a backwards incompatible P6, not (just) Larry.

Similarly for Python, Python 3, and GvR.

> Larry decided ... Perl5 wasn't fun anymore

Neither Larry nor GvR has said that about their earlier creations.

Larry did say "We intend to abandon the Perl 5 porter's model of development, which demonstrably leads to a lot of talk but little action." but that's the closest he got to saying Perl 5 wasn't fun anymore. (And P5 regained steam without abandoning the porter's model.)

When announcing P6 Larry said, of P5:

"We all like Perl 5 a lot. We all use it a lot. Many commercial interests will guarantee that Perl 5 continues to be well-maintained ... five years from now a lot of people will still be using Perl 5".

Unfortunately folk jumped to conclusions so this apparently measured statement was still in effect an Osborne One announcement.

> some remaining stuff that's running in Perl5

That's like saying there's some remaining stuff that's running in Python 2.x. True, but grossly misleading.

> some enthusiasts that use Rakudo ... in an attempt to reach the Shangri-La of Perl 6.

The attempt is to produce a compelling option for many future programmers. P6ers may eventually arrive and get stuck at Shangri-La but they clearly hope or think they won't.

> and Parrot

Perl is not tied to Parrot.

Rakudo (or rather the underlying NQP toolchain) is no longer constrained by Parrot because it has its own new dedicated VM. See http://moarvm.org

> blackmailing the users by declaring the "not as fun" old version of your language dead should not be done lightly if you don't want to become the next Perl.

I don't agree that GvR is blackmailing Python users.

But the issue isn't how things actually are. The issue is how things are perceived. If it seems that GvR is blackmailing users, and especially if a bunch of folk are saying as much, there's a problem.


Yes, exactly. The bugs have now become features that people depend on, and thou shalt never fix them. It's a horrible lesson, if you ask me.


I think the lesson is that if you're going to break backwards compatibility, you'd better offer something really enticing in exchange. Python 2 wasn't so broken that people were clamoring to get off it, and Python 3 wasn't so much of an improvement that people are clamoring to hop on it.


The lesson I see is that you should always implement a compatibility layer for using the old libs.
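Such a layer needn't be heavyweight. A minimal sketch of the idea, in the spirit of the real `six` library (names here are illustrative, not `six`'s actual API):

```python
import sys

# Minimal single-source compatibility shim: client code imports these
# names instead of branching on the interpreter version itself.
PY2 = sys.version_info[0] == 2

if PY2:
    string_types = (str, unicode)  # noqa: F821 - `unicode` only exists on 2.x
    from StringIO import StringIO  # moved to io.StringIO in Python 3
else:
    string_types = (str,)
    from io import StringIO


def is_text(value):
    """True for text strings on either interpreter."""
    return isinstance(value, string_types)
```

Libraries like `six` grew out of exactly this pattern, letting one codebase serve both interpreters instead of maintaining a port.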


I remember, when Py3 first came out, everything was incompatible -- unnecessary incompatibilities like the u"" notation for Unicode string literals, which was dropped. Unnecessary incompatibilities in the C-extension-module implementation layer. And so on. The list of incompatibilities was just huge.

Later several of them were dropped, like the string literal trouble ... But by then the damage was already done. Many extension modules were not ported to the new version, since the overhead was too big.

I think many more projects would have adopted Py3 if more extension modules supported it.

The huge library of extension modules was always the strength of Python. Now we have many projects still running on Py2, because Py3 ignored this strength.


The author didn't really give any concrete examples. What kinds of major Python libraries still need to be ported over to Python3?


The author is talking about long tail libraries, not major ones. The thousands of libs on PyPi that are still on 2.x


Except most of the more obscure libraries I find are either abandoned projects or were written to support the author's needs and don't fit the generic use case.

To note, that's not a Python-specific issue: it's the nature of any language ecosystem where modules can be made by anyone, and I'm not even sure it's a bad thing: I've learned a lot, even from libraries that I didn't end up using for whatever reason.

I think it's dangerous to imagine Python or any language as something where all the libraries you need always exist, and thus any time you have to write your own libraries it's a failure. Embrace the library ecosystem: you're going to eventually find something that isn't covered, which is when you write a library and participate.

Disclaimer: I know open sourcing code isn't always possible, and that time/project constraints can impact how much backend code a developer can write. But if we dismiss Python 3 just because some of the more obscure libraries aren't ported, I wonder how we ever picked up Python 2 originally.


I am not dismissing Python3 outright and neither is the OP. But rewriting millions of lines of code to satisfy an artificially imposed requirement is probably not a good idea, even if the libraries are "abandoned".

Python3 ranks high on the list of engineering failures. Python3 itself is fine, Python3 the deployment and migration is a disaster.


The abandoned libraries I find end up needing to be mostly rewritten, even if I were to use them on Python 2.


I think you overestimate exactly how many lines of code it takes to port from 2 to 3. It's not that much, unless you plan on using all the new features of 3.
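Most of a port is mechanical edits, not a rewrite. A hypothetical before/after sketch (the function and data names here are made up):

```python
# Hypothetical sketch of the typical mechanical 2-to-3 edits:
# print statement -> print function, dict.iteritems -> items,
# and truncating integer division spelled explicitly as //.

def summarize(counts):
    # Python 2: for name, n in counts.iteritems():
    for name, n in counts.items():
        # Python 2: print name, n
        print(name, n)
    # Python 2's `/` truncated for ints; Python 3 uses `//` for that
    return sum(counts.values()) // len(counts)

summarize({"spam": 3, "eggs": 4})  # returns (3 + 4) // 2 == 3
```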


Maybe there are thousands of unported packages, but frankly, how much of that is useful/ready for production use (production = paid software, with bugs translating to losses)? I'm perfectly fine with PySide, SQLAlchemy, ReportLab and a handful of other things. I don't need the n-th half-baked, Unicode-broken, poorly-error-handled, non-portable, Disney-tested lib. So if only 10% of the libs have moved and those are the production-quality ones, then fine. What a useless article.


Have to agree here.

Most of the libraries I'd actually trust for more than a one off script are already ported or have plans for porting.

I wouldn't trust a lot of the unported libraries in a serious Python 2 project, anyway.


Python (2) is heavily used in the finance industry, an industry that still uses COBOL and Fortran; they will never move on to python3.


> they will never move on to python3

Sure they'll move on to python3, in the same way they moved on to C++, Scala, and VHDL.

Perhaps they will never abandon Python 2, like they haven't abandoned Cobol and Fortran. However, that doesn't mean they won't move on to Python 3.


Sounds like Perl 6, all busy stifling Perl 5. (Which has sort of shaken itself out of somnambulance with a similar technique.)


Python 3 is doing a lot better than Perl 6.


Even if Python 3 emerges as this great thing, it could still stunt the Python ecosystem. Python (the thing) is distinct from Python programming; there are plenty of amazing languages that no one uses.


Rebol is doing better than Perl 6.

Python 3 occasionally prevents work getting done in Python 2. That's it.


Perl 6 isn't stifling Perl 5, at least not anymore. Since Perl 5.10, the releases have been fast and contain many new features.


"Stifling" could refer to technical aspects of the language and its ecosystem, or it could refer to adoption and market share.


This was a horrible article, with plenty of jumping to conclusions. I agree that Python 3 divided the community, but this article grossly exaggerates.


Basically the author's argument boils down to "if you have a long list of dependencies, some of them will not be available in Python 3 and therefore your program will be hard to port." That is true, but it also reflects an all-too-pervasive problem in modern open source software development: over-reliance on third-party dependencies, i.e. "dependency hell." We've all seen a Gemfile or requirements.txt that is absolutely absurd. People need to stop relying, in their production code, on little libraries created by some random dude 3 years ago and never updated.


And how do you feel about forking and maintaining those libraries? I think the 'little' here means 'arcane'. If the code serves a solid purpose, and you understand it enough to maintain it, it should be better than writing it yourself.


I had both versions installed (Arch Linux). I wanted to remove one of them a long time ago, but lots of Arch packages depend on python2 (gdb, mercurial, etc.).

python3 broke many of the 3rd-party scripts I use, so I made /bin/python a symlink to python2. Each python3 update would break these scripts again, and I needed to re-create the symlink. So, finally, I removed python3 completely.
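(A sketch of the kind of workaround I mean; using a per-user bin directory keeps the package manager from clobbering the link on updates, and the paths here are just my setup:)

```shell
# Shadow the system `python` with python2 via a directory that
# pacman never touches (paths are illustrative).
mkdir -p "$HOME/bin"
ln -sf /usr/bin/python2 "$HOME/bin/python"
export PATH="$HOME/bin:$PATH"   # add to ~/.bashrc to persist
```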

These 3rd-party python projects are well maintained, but the authors don't seem to bother porting them to python3 while python2 is still available.


People forget that making backward-breaking changes takes a while to play out. Python 3 was released at the end of 2008, two years after Java 6. How many systems do you know that still run on Java 5?


Java 6 did not break backward compatibility (code written for Java 5 generally runs fine on Java 6).


Code written for Java 2 runs fine on Java 8, with extremely limited exceptions. Java, just like Windows, is used too much in enterprise apps to break backward compatibility.


Yup. The biggest roadblocks in a Java upgrade are bytecode manipulation libraries and, at the moment, Maven.

I'm still intrigued by this situation: I'm running my software on Java 8, but I can't compile it as Java 8 unless I do it by hand (... really?)
