Yes, Python has a pretty dirty history, with many people choosing to stick to the Python 2.7 that they knew and loved. And yes, commercial software tends to move waaay slower than the wider community (many banks are still running COBOL). If you're focused on making money and pleasing clients then "it worked for us before" is always going to be the strongest argument.
Major players in the Python ecosystem have pledged to move away from Python 2, and if we had non-pie-chart visualisations, I'm sure we'd see huge trends towards Python 3 in the last year. Even slow-moving corporations are starting to use Python 3. Yes, macOS defaulting to Python 2 is still a problem, but Ubuntu switching to Python 3 by default is already a huge step towards getting companies to move forwards.
And we have other business priorities we have to pursue. As long as Python2 is viable. Which it still is and probably will be for quite a while. (Even once official support for it is dropped, there will undoubtedly be large enterprises that rely on it and will continue to maintain forks.)
Still. 15 years. It's not a last-minute change. Python has been incredible in keeping 2.7 around, giving people help and time to change. I know of NO other tech that did that. The JS community breaks its toys every other week. PHP just skipped a version to avoid it. No one is migrating to Perl 6. Python is actually the only one that pulled this off, and it has been very, very careful to give you the margin to do it.
Everything needs to evolve, and one day you will have to do it. Don't do it in 2020, or it will be way more costly.
And yes, you have many advantages in having Python 3, mainly ease of dev, debug and maintenance. This is not as hard as it seems. Took me a week alone to convert a huge system. Don't feel overwhelmed by the task, it's really not the end of the world.
Can confirm. Porting even huge code bases has become much easier lately thanks to recent changes in Python 3 (unicode literals!) and Python-Future. "from six import u" and many other hacks are no longer necessary.
Try it, and there's a huge friendly community there to help you :-)
When you have 3 engineers, one of whom is working full time on bugfixes and preventing everything from catching fire and one of whom is working full time on applying the second of 8 major versions' worth of upgrades to your framework (along with third-party dependencies that have to be version-compatible) just to catch up to a version that's still maintained with security patches?
When every framework/library upgrade, let alone language version upgrade exposes subtle backwards-incompatibilities with your external services, so you have to upgrade those too, and then spend 2 weeks understanding and solving the resulting forwards-incompatibilities?
Just because it was easy for you and your codebase doesn't mean it's going to be easy for everyone else.
I'm not saying we'll never do it. I'm sure as hell not saying we (being the engineering team) don't want to do it.
I'm saying we haven't done it yet because we can't do it at all until we complete the other upgrades. We can't do it at all without spending a week or three months or however long finding or writing replacements for crappy (and in a few cases not so crappy) abandoned third-party packages that aren't and never will be supported under Python3. We can't do it until we can spare 3 months from our timeline to exhaustively click-through test the damn thing and verify we haven't broken anything.
And saying we had 15 years to migrate is bullshit. Our major dependencies haven't been stable on Python3 for more than about 3 years now.
By the way, did I mention before that we have other business priorities? Python2 is still supported. It'll continue to be supported for a few more years. Why should we spend valuable time converting to Python3 now, when the other things we're working on could be the difference between whether or not we're still around in a few more years?
No, it won't be more costly in 2020. It'll be less costly. We'll have a larger engineering team because we solved more business problems and customer problems, so the business earned more money and could afford to hire more people. Or we'll already be out of business and it will have had nothing to do with which version of Python we were using.
No tech can be asked to be calibrated to compensate for 20 years of bad project management.
Yes, our code base is a giant fucking disaster. I wouldn't even begin to try to deny that.
I'm also pretty certain that there are others in the same boat to a greater or lesser degree. I've seen too many codebases that were the product of starting with a team of one or two and a mandate to move quickly. It's impressively easy to dig yourself into this sort of hole, and it's impressively common among startups in my experience.
I'd imagine that large enterprises are even more likely to have even larger legacy codebases without full test coverage. They may have more resources to throw at the problem but are also likely to have more political obstacles and even higher standards for verification that they haven't introduced new bugs.
But the GP was insisting that there's no reason not to have upgraded yet other than sheer stubbornness and unwillingness to learn new things, or even laziness on the part of the engineers. That's incorrect and insulting.
Just because the language is old doesn't mean it can't be the right tool for the job. For batch processing-type tasks, there's nothing better than COBOL on the mainframe. By default, COBOL doesn't let you dynamically allocate memory. That makes it incredibly easy for the compiler and system to optimize the compiled code and make it run super fast.
COBOL is great for what it was designed for: processing data. If you want to read data from a file/database, process it in some way, and then save that data back to the database/another file, you can't do much better than what COBOL offers.
The other 3 hours are in Python, Go, JS & co.
But regardless of the packaging effort, the real overhead is tracking security for the [overlapping] LTS releases.
On a data level: I wouldn't be surprised if most of these projects are either not actually new (i.e. only newly added to their CI system, with an older codebase) or are related to, interact with, or are integrated into a Python 2 codebase, which arguably isn't a from-scratch project either.
2.7, 3.3, 3.4, 3.5, 3.6:
There's also the sentiment of "I don't know if [insert module here] will be supported", which has become mostly baseless fear, but people still think Py3 support is lacking (when it's not!).
https://python3wos.appspot.com/ -- also, notice how 9 of these packages come from here
Edit: when I posted this comment, the link was titled "Python 3 largely ignored" (not the article per se, but the submitted link had been titled that way). It has been changed, but this was a bit of important context for my comment.
Here are the blocking issues for official support: https://github.com/Supervisor/supervisor/labels/python%203
I've been running it under Python 3 for a while with no problems.
In your case you know exactly what your needs are, and that they're unsupported in Python 3, but I'm sure that many of those 70% Py2 projects could run just as well in Py3.
Can I ask why you're using supervisor instead of systemd at this point?
Systemd on the other hand is just something that is being forced down our throats. :-/
supervisor: Application supervisor, not a widely used library - it's probably in the top 360 since people install it using pip instead of the distro's package manager
carbon, graphite-web and diamond: Components of the time series DB Graphite, see supervisor
Thrift: Apparently there's a Python 3 pull request?
ansible: Difficult story since they have to support old systems which only ship Python 2, so there's no point in migrating to Python 3 until those are no longer in use (that includes Python 2.4!).
This is a really unique use case for Python, so it's hard to compare it to the rest. The part of Ansible which runs on the controller has been ported already.
google-apputils: There are Debian packages for Python 3, so it's probably just missing the PyPI metadata. Appears to be unmaintained.
aspen: Niche web framework. They seem to be working on Python 3 support.
newrelic_plugin_agent: see supervisord
python-cjson: Unmaintained drop-in JSON library that can be replaced with something else with no effort.
python-openid: There's a Python 3 fork which looks nicer than the original.
m2crypto: Porting effort underway, but mostly an issue for existing code - there's no reason to use it for new projects since python-cryptography is better in all regards.
python-ldap: An issue for many enterprise Django projects (this is the one library in the list that I personally use), but there's a Python 3 fork so it's fine.
INITools: Abandoned; it used to be a pip dependency, so that's probably why it's in this list (?).
marionette: see supervisord
pathtools: moved to stdlib in Python 3.
So, nothing major left in the top 360.
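For python-cjson in particular, the replacement really is zero-effort: the stdlib json module covers the same ground. A minimal shim (assuming calling code used cjson's encode()/decode(), which as far as I remember was its whole API):

```python
import json

# python-cjson exposed encode()/decode(); the stdlib equivalents are
# json.dumps()/json.loads(). A thin alias makes the swap a one-liner.
encode = json.dumps
decode = json.loads

data = {"status": "ok", "count": 3}
assert decode(encode(data)) == data
```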
All of the huge blockers like Twisted and Paramiko are now green and I migrated all my stuff over.
The Fedora porting project has some great insight into how it's going in the real world:
OpenStack is interesting, too: https://wiki.openstack.org/wiki/Python3
In general, no major libraries left, but lots of grunt work that needs to be done.
Some stuff not on the list, that I encountered:
python-dajax: Turns out it's a bad idea to put frontend logic in your backend code, but that doesn't make the dependency go away (great opportunity to finally undo that mistake in my code)
wxPython: They decided to do a full rewrite instead of a port so it takes some time. Qt, I guess?
The only thing I'm dealing with which is still not Python 3 compatible is AWS Lambda, which only offers 2.7 right now. Supposedly Amazon is working on 3.x but... come on.
Haven't had a single Unicode bug in production since we switched to Python 3. Also, chained exceptions (extremely valuable in walled-off production environment).
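For anyone who hasn't leaned on chained exceptions yet, here's a minimal sketch (function and message names are made up) of what you get back from a walled-off environment:

```python
# In Python 3, "raise ... from ..." records the original cause, so a
# production traceback shows BOTH errors instead of swallowing the first.
def load_config(raw):
    try:
        key, value = raw.split("=")
    except ValueError as exc:
        raise RuntimeError("malformed config line: %r" % raw) from exc
    return key, value

try:
    load_config("no-equals-sign")
except RuntimeError as err:
    # The original ValueError is preserved on __cause__.
    assert isinstance(err.__cause__, ValueError)
```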
Type annotations actually work and have prevented a number of bugs.
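In case it helps anyone on the fence, a toy example of the kind of slip annotations catch. The checking is done by an external tool like mypy; the interpreter itself only stores the annotations:

```python
def total_price(quantity: int, unit_price: float) -> float:
    # mypy would flag total_price("3", 2.0) before the code ever runs.
    return quantity * unit_price

# At runtime the annotations are simply stored on the function object:
assert total_price(3, 2.0) == 6.0
assert total_price.__annotations__["quantity"] is int
```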
The asyncio syntax is really nice with Tornado (don't use aiohttp, use Tornado! it's much more mature)
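A bare-bones sketch of the native syntax using only the stdlib event loop (the Tornado integration isn't shown here, and the function names are invented):

```python
import asyncio

# Native async/await (Python 3.5+). Tornado can run on the same event
# loop, but this sketch sticks to the stdlib.
async def fetch(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    # Both "requests" wait concurrently, so this takes ~0.02s, not 0.03s.
    return await asyncio.gather(fetch("a", 0.02), fetch("b", 0.01))

loop = asyncio.get_event_loop()
assert loop.run_until_complete(main()) == ["a", "b"]
```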
With Python 3, that simply has never happened to me. Never.
I don't know why. I don't know what changed, what the secret sauce was. And what's even better: I don't have to care. Python now simply works for me.
So yes, Python did remove Unicode issues.
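For the skeptics, the "secret sauce" is mostly that Python 3 refuses to guess: bytes and text never mix implicitly, so the error surfaces at the boundary instead of deep in production. A tiny illustration:

```python
# Python 2 would silently coerce and sometimes explode much later with
# the infamous UnicodeDecodeError; Python 3 raises immediately.
payload = "naïve".encode("utf-8")   # bytes, e.g. straight off the wire
try:
    "prefix: " + payload            # mixing str and bytes
except TypeError:
    mixed_ok = False
else:
    mixed_ok = True

assert not mixed_ok
assert payload.decode("utf-8") == "naïve"
```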
It's a bit sad to not be able to use some things that I really love about Python 3 but it makes me happier to see other people enjoying my work, so I'm going to bear with it for now.
Using asyncio has been a delight for a lot of the small, quick web services we need to write (though not a replacement for Twisted). Looking forward to seeing the asyncio infrastructure grow.
But ansible scripts tend to be like 5 lines, so it is not the same as the scale you would write a webapp or something in Python3...
Client side code has to run on everything starting with CentOS 6, so they have to stick with Python 2 for a long time.
For the Ansible master (which runs on the management machine), Python 3 porting work is already underway.
We use Python 3.5 and are VERY happy about it.
That's why Python 2.7 will be around and maintain dominance for a while.
When the latest stable releases of the primary environments default to something, you need very good reasons not to use it (even when it's easy to change the default).
Administrators (in the old school not the new "everything in containers") don't like using non-defaults.
> for the time being, all distributions should ensure that python refers to the same target as python2.
edit: seems to be the case
> however, end users should be aware that python refers to python3 on at least Arch Linux (that change is what prompted the creation of this PEP), so python should be used in the shebang line only for scripts that are source compatible with both Python 2 and 3.
sed -i s/python/python3/ /all/the/things
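Half-joking, I know, but that sed will also rewrite "python-ldap", "python2" and friends anywhere in a file. A slightly more careful sketch in Python itself, touching only an actual shebang line (treat it as a starting point, not a battle-tested tool):

```python
import re

# Only rewrite a "#!...python" shebang on the first line, leaving
# occurrences of "python" elsewhere in the file alone. Note that
# "python2" shebangs are deliberately not matched.
SHEBANG = re.compile(r"^(#!.*\bpython)(\s|$)")

def rewrite_shebang(text):
    lines = text.splitlines(True)
    if lines and SHEBANG.match(lines[0]):
        lines[0] = SHEBANG.sub(r"\g<1>3\g<2>", lines[0], count=1)
    return "".join(lines)

assert rewrite_shebang("#!/usr/bin/env python\nimport python_ldap\n") \
    == "#!/usr/bin/env python3\nimport python_ldap\n"
```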
This is Canonical's stance:
> All Ubuntu/Canonical driven development should be targeting Python 3 right now, and all new code should be Python 3-only.
Regardless, nowadays many people deploy Python applications as stand-alone bundles. We're happily running Python 3.5 on CentOS 6 and it took like ten minutes to deploy (pyenv!). This lets us use a rock-solid base system with an up-to-date application stack, which makes our old school admins happy.
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.1 LTS
# lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.1 LTS
# python --version
The program 'python' can be found in the following packages:
Try: apt install <selected package>
# python3 --version
"for the time being, all distributions should ensure that python refers to the same target as python2."
I read the above as: the Python community itself made "python" the name for Python 2.7 and "python3" for Python 3.x. Nevertheless, an unfortunate choice at this time. I understand the reasoning at the time of writing (March 2011), but now this should be reconsidered.
Say, you use containers and so your application is isolated from the host system. When the next version of your host system comes with a new version of Docker, you have to do similar testing and validation anyway.
If I was writing software that had to be used in as many places as possible or sold, I'd absolutely strongly consider using 2.7.
I understand the desire to have stable releases. But... if you're not shipping the most recent stable version of something, and instead shipping something that's 4-5 years old... it's frustrating for users and developers alike.
I understand there's obvious use cases for Go, but an amazing amount of what we do is taking data from place 'A' (web, socket, file), sending it to place 'B' (another web, socket, file), and then returning a result. Go doesn't buy you much that asyncio won't.
Static typing. Oh Lord is it nice.
When I wrote Python, I never quite knew if there'd be a runtime error (in my code or a library). With Go, there are a lot fewer of those.
All the jobs I see are Ruby + Go or Python + Go, which makes it pointless to learn Go if you don't care about Ruby or Python.
Pretty much, in my career. Sure, I know plenty of languages, but my day job is slinging Go.
I honestly don't want to program in Python any more. I never thought I'd say that, but it's true.
But we will be upgrading a large code-base from 2.7 to 3.x. For things like web data aggregation and parsing, it is hard to beat Python; the tooling in that space is really good.
Definitely looking forward to some unicode pain alleviation from the rewrite.
What date does 1/2-3 mean?
I've had raids cancelled when we agreed on 19:45, but showed up an hour late because he didn't realize that nobody plans raids in GMT.
Isn't it annoying when people start speaking a foreign language in the middle of their post? And not even Google Translate helps once dialect comes into the picture. And as a non-native English speaker, I'd also much rather have the original English version of whatever you're doing than the half-baked translation that's missing tons of features that are always "coming soon".
No, Unicode-as-default didn't cause this, but the switch shows a switch in culture towards making i18n easy.
You don't see value in meeting customers in their own language and culture?
I find it funny how a small group of people thinks they know better than the wider community, to the point that they feel they should have a say in what thousands of businesses use to run successful code.
More than this, I would argue that most people using Python 3 are those new to the language. This is only from personal experience, so it's really just anecdote.
PS: As a kind of jesting side note, I know the general argument is that "Python 2 was broken", but really, how broken can something be when thousands of businesses depend on it and, more than that, choose to keep using it when a "better" alternative comes about?
So for this feature alone, I'm all in on Python 3.
Recklessly breaking backwards compatibility without a smooth migration plan is hostile to developers. Although it's not a programming language, this is something that I think React has gotten right with their current versioning scheme.
But in order to make it happen, and to provide the long lead time for migration, 3.0 had to get out the door when it did. So it wasn't "premature".
Upgrading to 3 doesn't cause bugs, it reveals bugs that many people would rather brush under the rug.
Yes, it breaks a lot of things, but it's totally worth it for the performance gains.
Py3k partially solves problems like Unicode or async/await, but those are non-issues for skilled Python 2 developers.
People like incentives to upgrade. Period.
- clean Unicode support which prevents mistakes which used to bite us in production
- native async syntax, compatible with Tornado
- asynchronous generators (yield inside a coroutine!)
- pathlib and os.scandir
- type annotations
- matrix multiplication
- chained exceptions
- faster dicts, ordered by default
So many useful features, and with the latest Python 3 releases porting has become really easy thanks to improved compatibility with the old syntax.
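A few of those features in one self-contained snippet (dict ordering is an implementation detail in 3.6 and only guaranteed later; and since numpy isn't a given here, matrix multiplication is shown with a toy class):

```python
from pathlib import Path

# pathlib: paths are objects, joined with "/" instead of os.path.join.
cfg = Path("/etc") / "app" / "settings.ini"
assert cfg.suffix == ".ini"

# dicts preserve insertion order (3.6 implementation detail, 3.7 guarantee):
d = {"first": 1, "second": 2, "third": 3}
assert list(d) == ["first", "second", "third"]

# The @ operator dispatches to __matmul__; numpy uses it, but any class can:
class Vec:
    def __init__(self, *xs):
        self.xs = xs
    def __matmul__(self, other):       # dot product
        return sum(a * b for a, b in zip(self.xs, other.xs))

assert Vec(1, 2, 3) @ Vec(4, 5, 6) == 32
```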
I think the flip side of that - ~30% adoption of the newest version at 1-2 years in, especially for one that breaks backwards compatibility - isn't bad.
I've started numerous python projects at work over the last year all on 2.7. I think we just started the last one. We're finally ready to jump to 3.
If async and this had been in Py3 from the start, they could've made a far better argument: "hey, we broke all this stuff, but you get async I/O and faster dicts". (A note on faster dicts: since many classes and language mechanisms use dicts, they should have a sweeping effect on overall Python performance -- maybe not 30% or 40% faster, but a few % all around, which is kind of what micro-benchmarks are showing with the 3.6 betas.)
What if I am not running into any significant Unicode issues for the past 6 years?? (insignificant == stuff I can fix in 10 minutes). What if async/await has been achieved through other libraries (e.g. GEvent, threadpools, whatever)... What is the incentive to switch a large (working) codebase?
A lot of bugs disappear (the API is unified, division is true by default, imports are absolute), or become easier to detect (clear exception if you compare incompatible types, bytes are very explicit).
Many duplicate ways of doing things are removed (modules have been renamed, builtins have been cleaned up), and the syntax is terser (no more (object) in class definitions, no more from __future__, super() has a shorthand, no more # coding, no more u'').
It's easier to write low-memory code (most things are generators by default, and yield from lets you delegate to a sub-generator).
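A quick illustration of that last point, since yield from is easy to underestimate:

```python
def read_chunks(chunks):
    # A sub-generator: nothing is materialized up front.
    for chunk in chunks:
        yield chunk.strip()

def read_all(sources):
    for source in sources:
        # "yield from" delegates wholesale to the sub-generator,
        # including send()/throw(), with no intermediate list.
        yield from read_chunks(source)

result = list(read_all([[" a ", "b "], ["c"]]))
assert result == ["a", "b", "c"]
```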
Creating a clean library is also much easier. Now I'm playing with type hints, and yes, they do help avoid bugs. "raise from" makes error reporting much better. No implicit relative imports makes organizing code much easier.
Basically every time I go back to Python 2, it feels like a pain. So many little things I didn't notice and took for granted, but they add up quickly.
The worst, of course, being writing 2/3-compatible code and having to do things like: https://github.com/Tygs/ww/blob/master/src/ww/types.py. But I only do it for libs specifically meant to help me migrate projects (ww wraps everything the same way in 2 and 3, making migrating easier).
I look forward to 3.6 introducing f-strings, which I will adopt ASAP because they're so convenient.
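For those who haven't seen them yet, a small taste of f-strings (3.6+; the values here are invented):

```python
name, total = "deploy", 3.14159

# Arbitrary expressions and format specs go straight into the literal:
msg = f"{name} finished in {total:.2f}s ({total * 1000:.0f} ms)"
assert msg == "deploy finished in 3.14s (3142 ms)"
```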
Right now, the major things stopping me from using Python 3 for all the things are:
* Two large in-house developed apps taken over from previous devs
* Debian Wheezy
Other than that, I try to use Python 3 as much as possible. I know a lot of the workarounds for getting pyvenv working on various distros, for example.
I also know about a lot of alternative libs like ldap3, dnspython3 and more.
I think about Python 2 as Ansible's dependency, not something I have to use because it's there.
Isn't that kind of a bad sign, though? Should it really require 'considerable effort' to use the latest release of a fairly widely adopted language?
But yes, it's a bad sign in the sense that more effort is required than with Python 2.
It's getting better all the time though. Most of the effort is due to a local Mac OS development environment, plus legacy Debian Wheezy and Ubuntu systems that are hard to replace.
But what do you mean when you say your system uses 3.5? Yes, of course the system can have Python 3 and 2 simultaneously, but Ansible still requires Python 2 to run.
And I haven't tried this but I'd assume lib ansible can't be used by python 3. Maybe you can confirm.
Though all it takes is one dependency to throw a wrench in the plans, the major projects are Py3 now.
Though porting large python 2 projects is still a huuuuuge pain. This is more a result of python 2 badness than python 3. But a lot of work.
I don’t want to add a dependency for end users when my current code works fine with the built-in Python at 2.x. Nor do I want to bloat the size of downloads to embed something like an entire interpreter binary. (Swift has the same issue; there is currently no way to refer to it externally, as it is in flux; I do not want to embed loads of Swift libraries in every build so I will wait to migrate until they have stabilized binaries and made them available from a common root.)
/usr/bin/python will always be Python 2. See https://www.python.org/dev/peps/pep-0394/
Python 2 isn't installed by default on Ubuntu cloud images since 16.04.
> however, end users should be aware that python refers to python3 on at least Arch Linux...
That clearly means he will recommend /usr/bin/python migrate to python3 some day, and that today developers already can't count on it being python2.
That's as antagonistic to "python will always point to python2" as you can get.
$ lxc launch ubuntu:16.10 yakkety
$ lxc exec yakkety python || echo no py2
On a clean 16.04 install, Python 3 is present and Python 2 is not; however, you still need to call Python 3 with "python3".
Edit: see qznc's remark on the PEP; apparently it's even advised to keep "python" pointing to Python 2.
Yes. If you do a clean install of 16.04 then you won't have python2 installed at all.
Point is, we're a hybrid shop that does all first-pass services in Python 2.7 and then moves them to Go when they become sufficiently trafficked and/or critical.
When pika got a python3 compatible version, we ran out of excuses to not use python3, but still my colleague was worried that "it would take too long to port things, and our-code-is-working-so-why-bother". I decided to try it anyway, and it took me less than a lazy Friday afternoon to run 2to3 on all the packages and get the tests to pass.
Given that your work is in scientific computing and (presumably?) you have control over the deployment environments, I'd really recommend that you do the switch sooner rather than later.
I stuck with that line because maybe his colleagues are also waiting for "50% of colleagues using Py3", and it's just a deadlock until a few brave enough start to port and tip the balance.
Python 3 also has a load of improvements that would make scientific programming faster.
Perhaps instead of thinking about not having spare time to port your code, think of it as saving future you time by not creating more work to port over when you end up really needing that mind-blowing new library :)
Teachers and kids are already ahead of the game.
When today's kids graduate, they'll view Python 2.7 perhaps the same way I view Delphi or VB6 ("you're still using that..?" etc.).
The ecosystem has moved far enough that it's just a matter of time until the big frameworks etc. deprecate Python 2.7, and this will mostly happen around its real EOL.
I suppose the sample set (projects using a tool like Semaphore) is likely to be biased towards more forwards-looking teams, but still.
How many of the projects support both?
Python 3 had no carrots, as others mention it only seems like now it's got enough going for it to be worthwhile. That's the lesson even Microsoft forgets and relearns continuously: if you're going to push out a major upgrade, you better have nice carrots attached. The stick only approach doesn't work when there is competition, and Python has plenty of competition, lots of it just with itself.
I don't use Python for any serious development these days, but this is my impression as well. Python 3's promise was "we'll break all of your code in exchange for a few minor things some people think are improvements." When users balked, the devs decided to force the change by brow-beating and shaming users, with predictable results.
Perl 6 had lots of carrots, but was so different that the only plausible portability story was to run existing code in an embedded Perl 5 interpreter with some kind of interoperability layer. It's a new language, still looking for its niche. Ruby devs were so used to breaking changes that they considered porting to 1.9 to be more of the usual. Meanwhile, C and C++ happily coexist, despite major divergence by C++ and minor incompatible changes by both languages.
(And Perl 5 code can declare which of the newer features it will support. Excellent backwards compatibility.)
It does seem like take-up has been low so far, but it's probably still too early to call whether it will turn out to have been the right choice.
I'm less familiar with Python, so can't really comment on there, but it does seem that Python 2.7 had fewer egregious faults than Perl 5.
Then there is Devel::Declare etc with extensible syntax... :-)
Certainly open to my mind being changed, though!
Ruby 1.9 also fixed the accurate handling of strings potentially encoded with multiple bytes.
Fast forward to today: we didn't have turmoil like with Python, and now not even Ruby 1.9 is supported by the official dev team anymore. And nobody is holding out on 1.8 anymore.
Compare that to Python 2 which will still be supported until 2020!
When Python 3.0 was released, both Django and NumPy were still years away from getting official Python 3 support into their releases, so no one bothered to invest too much time into it, and this caused a major loss of momentum that is only today starting to slowly return.
Essentially all string manipulation functions also have an equivalent function with the same name prefixed with "mb_", which works for multibyte strings.
The practical effect is that people use the default string functions, and then things break when multibyte strings start showing up! It is backwards compatible, but it's certainly ugly, and that's quite a tradeoff.
In Java I am largely talking about library upgrade issues, because Java made the decision to remain language backwards compatible. Something that has its costs but also some real benefits. See Linux and its promise to never break user space.
CPython is so slow, they really should start taking performance more seriously.
But the new dict and a few other optimizations in 3.6 will make attribute lookups pretty close to C++ speed if not the same.