A bit curious that 3.4 has been dropped, as that's the stock Python 3 shipped for CentOS 7.
I'm really upset with the distro vendors; they should have migrated their base-system Python scripts to 3.x years ago and included 2.7 only for legacy support. RHEL 7.0 came out in 2014, for goodness' sake.
Not to sound condescending... But why does this single default matter that much?
I'm sure you install a lot of things onto these machines that aren't there by default, right?
Why is installing Python 3 on top of all those other non-default packages such a problem?
Not sure how you see a system being deprecated in 2020 as perfectly fine.
14.04 (2014): Python 2
16.04 (2016): Python 2
18.04 (this is obvious but... less than 12 months ago!): you guessed it, Python 2 (!!!)
So, still in 2018 and until 2020 (when I guess the next Ubuntu LTS release is scheduled), all instructions on the internet telling you to run "python" or "pip install" will run Python 2 if copied into an Ubuntu terminal.
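This mismatch is why many scripts from that era open with a version guard. A minimal sketch (not from any particular project, just the common defensive pattern): fail fast with a clear message instead of crashing later with a confusing SyntaxError under the wrong interpreter.

```python
import sys

# Defensive guard commonly added during the 2-to-3 transition:
# refuse to run under Python 2 with an explicit message.
if sys.version_info[0] < 3:
    sys.exit("This script requires Python 3; you ran it with Python %d.%d"
             % sys.version_info[:2])

print("Running on Python %d.%d" % sys.version_info[:2])
```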
I already said this in a previous discussion about Python 2 vs. 3: what a lost opportunity.
If python2 was python2.2, it wouldn't support `with`. It's just that python2 has been Python 2.7 for a decade, so it's been relatively stable.
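To illustrate the point about 2.2: the `with` statement only arrived in Python 2.5 (behind `from __future__ import with_statement`) and became standard in 2.6, so genuinely old Python 2 code had to use try/finally instead. A small sketch of the two styles:

```python
# Python 2.2 had no `with`; it arrived in 2.5 (behind a __future__
# import) and became standard in 2.6.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# Old style (pre-2.5): explicit try/finally to guarantee cleanup
f = open(path, "w")
try:
    f.write("hello")
finally:
    f.close()

# Modern style: `with` closes the file automatically, even on error
with open(path) as f:
    content = f.read()

os.remove(path)
```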
I guess Ubuntu keeps it that way to avoid breaking changes, though.
New code has no problem explicitly requesting python3; old code would need changing to explicitly request Python 2, but if you're touching it anyway, it's of course safer to make that change.
And, based on my experience, general open source is usually much more cutting edge than the larger distros; that's the whole point.
What package are you referring to? As I understand it there is no "stock" Python 3 in CentOS 7. There is just what you can get from secondary repos, like EPEL, SCL, and IUS. Those repos now have 3.6 available.
I need to reach out to the cmake3 maintainer because they just added it as a dependency and I have no idea why.
[root@localhost ~]# python --version
Will that awesome async stuff help us migrate several hundred thousand lines of 2.7 code?
Yeah, right. There's a reason even companies with tons of resources like Google still use 2.7.
But good news, you don't have to do it. Python 2.7 will still work after 2020. We, the community, will just stop working on it for free. You can start paying us, or do the job for free like we did since 1991.
You can also migrate to another tech, but they will break things as well, and they won't give you 12 years to move. Ruby and Node gave 2 years. Node forked twice, and it's built on Google tech.
I'm tired of hearing complaints, given how good you have it.
It took Flask several years, complaints, and a custom change for their use case, to port over.
Some people have actual production code which they don't rewrite for the fun of it (and even less with comically underestimated "2 weeks per 100K" runs), because the core decided a non-backwards compatible version is the future.
>But good news, you don't have to do it. Python 2.7 will still work after 2020. We, the community, will just stop working on it for free. You can start paying us, or do the job for free like we did since 1991.
Well, you're not the "community", at best you're one committer. The community (not necessarily the core devs) will fork and maintain way beyond 2020, and it will be for free too.
>I'm tired of hearing complaints, given how good you have it.
Sorry, didn't know people must walk on tiptoes lest they tire you with their complaints...
I think the overly backward-compatible way in which the change was managed really prolonged the adoption and the maturing of Python 3.x. The Python community was hanging between those versions far too long, and it really hurt the ecosystem and the whole Python experience.
> Some people have actual production code which they don't rewrite for the fun of it (and even less with comically underestimated "2 weeks per 100K" runs), because the core decided a non-backwards compatible version is the future.
Of course, but you (probably) still have to maintain the code. And it's not a full rewrite, more an adaptation. I find most changes to be quite mechanical. Also, it's not the Python community's fault; you chose Python, and you're probably using it for some business purpose. So I think the Python community can expect you to maintain your code, including language adaptations. You can still ignore this and not update, but libraries will start becoming incompatible, and there will come a time when things break. Whether that's important enough to justify migrating is up to you.
I don't think it's fair to just complain loudly; after all, the Python community is not expecting a full rewrite. They have given plenty of time to migrate (too much, I think, but it's still a lot). Being completely backward compatible forever was never promised, and I think it is a complete anti-feature. Write-once-and-never-think-about-it is not the Python way; having a clean, smart, elegant language is. Eternal compatibility also really hurts Java, which walks with all the baggage of the past and a glacial speed of language updates, which is probably required when everything you do is going to be kept in the language forever.
However, I understand that those benefits might not be motivation enough to migrate in some contexts.
In those cases, I advise freezing the project in time, isolating it by vendoring dependencies, and either compiling it with Nuitka or distributing it in Docker.
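The "quite mechanical" changes mentioned above mostly follow a handful of patterns. A sketch of the most common ones, shown as they look after conversion, with the Python 2 form in comments:

```python
# Typical mechanical 2-to-3 conversions:

# Python 2: print "total:", n
n = 3
print("total:", n)                    # print is a function

d = {"a": 1, "b": 2}
# Python 2: for k, v in d.iteritems():
for k, v in sorted(d.items()):        # iteritems()/iterkeys() are gone
    pass

# Python 2: 7 / 2 == 3 (integer division)
assert 7 / 2 == 3.5                   # / is true division by default
assert 7 // 2 == 3                    # // is floor division

# Python 2: xrange(10)
squares = [i * i for i in range(10)]  # range is now lazy; xrange is gone
assert squares[3] == 9
```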
I would go so far as to say that Python 3.4 was the first one you could really use in production, which leaves you with 5 years to handle the upgrade. Still, before that you should at least have started adding basic stuff like importing print_function and testing all new code against Python 3. Many just kept ignoring Python 3 and wrote line after line of new incompatible Python 2 code.
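The "basic stuff" referred to here is the `__future__` imports: added at the top of a Python 2.7 module, they switch its semantics to match Python 3, so the same file runs on both interpreters. On Python 3 they are harmless no-ops, which is what makes them a safe first step. A minimal sketch:

```python
# Forward-compatibility imports: on Python 2.7 these opt in to
# Python 3 semantics; on Python 3 they do nothing.
from __future__ import absolute_import, division, print_function, unicode_literals

half = 1 / 2            # true division even on 2.7
assert half == 0.5

greeting = "hello"      # a unicode string on both 2.7 and 3.x
print(greeting)         # print is a function on both
```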
I no longer feel sorry for those who are stuck on Python 2.7 and now have less than a year to upgrade. At this point it's your own fault.
You'd need insane test coverage, of all possible code paths, to be 100% confident.
Most of the conversion difficulties were related to Unicode/text/binary string handling in Python 3, and 2to3 didn't catch all of them. This was the biggest challenge of the entire process. Stuff would fail in production because of improper string handling. Python 2 was remarkably permissive (i.e. loosey-goosey), whereas Python 3 is stricter and arguably more correct, but this strictness has a cost.
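The strictness described above is easy to demonstrate. Python 2 silently coerced between byte strings and unicode; Python 3 refuses to mix them, which surfaces latent bugs at the call site:

```python
# Mixing bytes and str: implicit coercion worked on Python 2,
# Python 3 raises TypeError instead.
try:
    b"header:" + "value"
    mixed_ok = True
except TypeError:          # "can't concat str to bytes"
    mixed_ok = False

assert mixed_ok is False

# The fix is an explicit decode (or encode) at the boundary:
fixed = b"header:".decode("utf-8") + "value"
assert fixed == "header:value"
```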
The other class of problems is the restructure of certain std libs, like urllib. We took the opportunity to move away from "urllib" to "requests".
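For reference, the restructure in question: Python 2 split URL handling across `urllib`, `urllib2`, and `urlparse`, while Python 3 folds them into one `urllib` package. A small sketch of the renamed locations (no network access needed):

```python
# Python 3 layout of the old urllib/urllib2/urlparse split:
from urllib.request import urlopen            # was urllib2.urlopen
from urllib.parse import urlparse, urlencode  # was urlparse / urllib

# urlopen is imported above only to show its new home; the parsing
# helpers can be exercised without touching the network:
parts = urlparse("https://example.com/path?q=1")
assert parts.netloc == "example.com"
assert parts.path == "/path"
assert urlencode({"q": "py3"}) == "q=py3"
```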
This site was invaluable in understanding the migration issues.
All in all, apart from breaking Unicode issues, the migration process was fairly easy.
In the end I decided moving to Rust was less work than Python 3 (in terms of being sure my program was reliable and wouldn't error out with unicode errors).
That means the code was working with raw bytes, not UTF-8 text, so bytes are what you should convert to on Python 3.
That means using `.encode()` and `.decode()` or using bytestrings.
Python 3 doesn't break string usage, it makes you do it correctly. Expect a lot more of that kind of pain when moving to Rust (not saying it's not worth it; you'll indeed end up with more correct programs).
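The `.encode()`/`.decode()` discipline mentioned above boils down to converting exactly once at each I/O boundary. A minimal round-trip sketch:

```python
# Explicit conversion at the bytes/text boundary:
raw = b"caf\xc3\xa9"          # UTF-8 bytes, e.g. read from a socket
text = raw.decode("utf-8")    # bytes -> str at the input boundary
assert text == "caf\u00e9"    # i.e. "café"

back = text.encode("utf-8")   # str -> bytes at the output boundary
assert back == raw
```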
I found it much easier in Rust, as the two types are just distinct, and an incorrect program just fails to compile.
3.6 has been in dev and testing since 2016
The next stable release (Buster aka Debian 10) is just starting the freeze process now and expected to come out this summer - Debian is in practice on a ~two-year release cycle, and regular support for each release lasts one year after the next stable release comes out. Buster will have Python 3.7 as default.
I already have apps that won't install unless 3.6 is there
Somehow I feel like every system that I want to keep updated stays on an old version, and every system that I want to keep pinned to some version just force-feeds me its updates...
There is backports, and I've thought about maintaining a newer Python interpreter in backports (back when I was working for a company using Jessie, trying to use a Debian stack for everything, and sort of wanting async syntax). You could introduce python3.6 or python3.7 as a new package in backports.

But it would be extremely hard to update the Python modules in Debian to the newer version too, because there's just a "python3-foo" package for each module. So you'd either have to recompile all binary modules and patch them to work on the newer Python as necessary (and patch anything using async or await as regular identifiers...), or you'd have to introduce an interpreter that intentionally wasn't compatible with modules installed as Debian packages. (Or you could introduce an interpreter that tried its best and failed randomly, which seems worse than either option.)

I'm personally sympathetic to wanting to get /usr/bin/python3.6 from my OS even if the only useful thing to do is put a virtualenv on top, but I think it makes sense to just install Python yourself into /usr/local at that point.
Re latexmk, it looks like the volunteer maintainer hasn't updated it since 2015, and there's a culture of not stepping on people's toes and waiting a while before assuming they're inactive or no longer care. But it's been updated in the development release. The current changelog https://metadata.ftp-master.debian.org/changelogs/main/l/lat... has a bunch of "NMU" entries, non-maintainer uploads.
No, Debian stable doesn't get bugfixes, only if those bugfixes are for security issues! (or some really severe bugs)
Example: Debian's Python is still 3.4.2, not 3.4.9!
Re latexmk: so am I getting the development release on Arch? I get the 2018 version with texlive-core.
Ah, yes, the standard bureaucratic cover for abuse: retroactively justify it with formal committee action, by asking people to submit examples of badthink and badspeak from years ago:
2012: 1 email
2013: 7 emails
2014: 5 emails
2015: 1 email
2016: 0 emails
2017: 2 emails
2018: 2 emails
Downvoters: If you're honest, then do your homework instead of downvoting. If you're dishonest, and you're trying to cover it up, shame on you--you're as bad as the abusers in Debian.
But I wouldn't want Debian to change their release behavior. And to be honest regarding different versions: Python has always been a special needs kid in this regard.
The NumPy 1.16.0 notes say
> Support for Python 3.4 has been dropped, the supported Python versions are 2.7 and 3.5-3.7.
That definitely sounds like NumPy 1.16.0 doesn't support python 3.4.
Seriously. I think we have to treat it as legacy code now.
Seriously, it’s been 10 years since Python 3 came out, Python 2.x has been supporting legacy code since then.
And let's not understate what a big deal it was to break compatibility like that. Porting a sufficiently complex project is non-trivial.
My company has >1 million lines of Python 2.7. About 1.5 years ago we managed to pass our test harness (about 80% line coverage) on Python 3. At the moment we can't do that, but we believe that if we tried hard for a few weeks we could do it again. I don't know how we define "legacy code", but I code in this codebase every day and the code is actually very pretty and understandable. We also have a Python 3 project (only a few thousand lines of code) and god, that code is definitely ugly as shit. Even after Python 2 goes unmaintained, I will still think our Python 3 code looks more like legacy.
Can you please give some examples? I've recently picked up Python 3, wondering what well known traps I should avoid.
Strongly wishing things to die doesn't make it so...