
New IPython release drops Python 2.7 support - peterdemin
http://ipython.readthedocs.io/en/stable/whatsnew/version6.html
======
makmanalp
The 5.x release fixed a lot of my UX quibbles with the completion menu and added
syntax highlighting, and now they're further improving the completion quality
and the menu. I'm so excited!

~~~
sandebert
And you just can't hide it.

~~~
hyperbovine
It's a shame that humorous comments are so consistently downvoted here. Are we
all supposed to be post-laughter now or something?

~~~
pekk
Low-content jokes never did particularly well on HN. Like it or loathe it,
it's been a cultural feature here for years.

~~~
hyperbovine
I don't remember it being this bad maybe ~8 years ago.

------
zitterbewegung
I use Jupyter and IPython nearly every day. The new autocomplete looks
extremely promising. If you are running Python 2.7, it may not be a big deal to
just stay on IPython 5.x, though.

The two major uses I have are prototyping nearly any coding project that
requires Python and teaching myself data analysis. This has saved me hours if
not days, because the feedback loop is so fast. When I code in other languages,
I desire the IPython interface.

~~~
jzymbaluk
> When I code in other languages, I desire the IPython interface.

It's not specifically IPython, but Jupyter does support many other languages
through kernels!

[https://github.com/jupyter/jupyter/wiki/Jupyter-kernels](https://github.com/jupyter/jupyter/wiki/Jupyter-kernels)

------
breatheoften
A comment from a beginner Python user -- but I need to rant ...

I've been doing a bit of python the last few weeks for some image
processing/computer vision tasks (using opencv and numpy).

I have to say, all together it's a pretty miserable developer experience.

Python is incredibly slow, forcing pretty much all computation, no matter how
trivial, into contortions via numpy incantations so that the inner loops can
run in native extensions -- and these incantations have a lot of implicit,
poorly documented magic. Miss some detail in the behavior of the magic and
suddenly you have a 10x slowdown -- but good luck finding where. I would kill
for an easy-to-use tool like Xcode's Time Profiler ...
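(For what it's worth, Python does ship a usable profiler in the standard
library. A minimal sketch of hunting down a slow inner loop with cProfile --
the `slow_sum`/`fast_sum` functions are made up for illustration:)

```python
import cProfile
import pstats
import numpy as np

def slow_sum(a):
    # Pure-Python loop: every element crosses the C/Python boundary.
    total = 0.0
    for x in a:
        total += x
    return total

def fast_sum(a):
    # Stays inside NumPy's native code.
    return float(a.sum())

a = np.random.rand(200_000)

profiler = cProfile.Profile()
profiler.enable()
slow_sum(a)
fast_sum(a)
profiler.disable()

# Print the five most expensive calls; slow_sum dominates the report.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```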

API usage errors (even those where invariants are checked at runtime) are
ridiculously uninformative. OpenCV, for example, does quite a bit of runtime
sanity checking on the shape and type of arguments to its methods -- but
somehow even simple details, such as which parameter caused the error, don't
get reported in the stack trace, severely increasing the cognitive load
required to identify the mismatch. Not fun when multiple arguments to an API
are the result of a chain of numpy data munging. This may be an OpenCV
complaint more than a Python one (aside: OpenCV is pretty terrible).

I'm not sure what I'm doing wrong with Python, but I find the majority of my
code to be menial data munging -- and I haven't figured out good patterns to
organize this munging in any sensible way. With a static language, DRY patterns
to centralize such plumbing operations have the awesome effect of moving
invariants into reasonable places. In Python, without any ability to organize
guarantees, I find myself needing to repeatedly check data shapes/types as the
code base evolves -- there doesn't seem to be an obviously useful way to
organize verification of data types as the necessary invariants become
apparent. These issues are compounded by the fact that refactoring is an
enormous pain in the ass!
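(One pattern that helps a little with the repeated shape/type checking:
centralize the invariants in a tiny checker called at function boundaries, so
failures at least name the offending argument. A rough sketch -- the `expect`
helper is hypothetical, not from any library:)

```python
import numpy as np

def expect(arr, shape=None, dtype=None, name="array"):
    """Raise a ValueError naming `name` if arr violates the invariants.

    Entries of None in `shape` act as wildcards.
    """
    if shape is not None:
        if arr.ndim != len(shape) or any(
            want is not None and want != got
            for want, got in zip(shape, arr.shape)
        ):
            raise ValueError(f"{name}: expected shape {shape}, got {arr.shape}")
    if dtype is not None and arr.dtype != np.dtype(dtype):
        raise ValueError(f"{name}: expected dtype {dtype}, got {arr.dtype}")
    return arr

img = np.zeros((480, 640, 3), dtype=np.uint8)
expect(img, shape=(None, None, 3), dtype=np.uint8, name="img")  # passes
```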

I feel like all my Python code is throwaway code. Maybe that's what I'm
missing -- I need to just accept that all the Python numeric code I write is
pure one-off junk, embrace copy-paste, and never try to reuse any of it ...

Sorry for the rant! I remember loving dynamic languages when I first
discovered them -- but right now, I really miss C++ (or, even better, Swift).

I can't imagine the number of hours wasted because of these overly dynamic
tools -- and there is simply no recouping that lost time. As these languages
grow, if the house-of-cards ecosystems they sit atop grow and motivate more
use, then ever more developer hours will be lost to avoidable trivialities ...

~~~
bmarkovic
The idea of using Python for high-performance numerical work is daft. The
whole scientific Python thing revolves around the fact that Python is an
excellent low-boilerplate research and prototyping language, with possible
production uses where performance isn't critical. Unfortunately, this led to a
lot of software in this area being developed for it. You wouldn't be doing
performance-critical production work in R or MATLAB either. I'm afraid that
for serious number crunching nothing can truly replace compiled, statically
typed languages. Additionally, you can't rely on GPGPU abstractions without
fully understanding the underlying mechanisms, and I'm pretty sure the same is
true for NumPy. Your problem is your expectations.

~~~
yongjik
There's "not for performance-critical use", and then there's "why is it
gobbling up all the RAM (~10GB), making it impossible for me to do anything,
when the entire data set I fed it is several hundred MB?"

Python frequently saunters into the second territory. Well, I guess there are
tools available to profile memory usage and such, but if I had to spend that
much effort tracking down memory issues, I might as well rewrite it in C++.
(It doesn't help (or maybe it helps?) that I'm much more comfortable with C++
than Python. YMMV.)

~~~
jacquesm
There are some easy to avoid pathological cases such as concatenating numpy
arrays in a loop.

Take this gem from a well known course:

    np.concatenate([x.next() for i in range(x.nb)])

That looks pretty innocent, but it can eat up your memory in the blink of an
eye if the input is large enough. That's the sort of pitfall a lot of Python
code suffers from: the abstractions are _just_ nice enough to make you believe
this will work without penalty, and without knowing how it is implemented
under the hood you're suddenly out a few gigs of RAM.
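(To make the loop variant of the pitfall concrete: repeated concatenation
copies everything accumulated so far on every iteration, while collecting the
chunks in a Python list and concatenating once does a single copy. A minimal
illustration of the two patterns:)

```python
import numpy as np

chunks = [np.ones(3) for _ in range(4)]

# Anti-pattern: each concatenate copies the whole running result,
# so the total bytes moved grow quadratically with the input.
out = np.empty(0)
for c in chunks:
    out = np.concatenate([out, c])

# Better: gather the chunks in a list, concatenate once (one copy).
once = np.concatenate(chunks)

assert np.array_equal(out, once)
```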

~~~
ofek
If np.concatenate accepts any iterable, removing the brackets is what you want
to do there.

~~~
jzwinck
That won't help. Concatenate needs to know the total length up front. It will
evaluate all the input items at once.

------
gcr
I understand the notebook machinery itself is running on Python 3, but can I
still launch a Python 2.7 kernel? I am still working with Python 2-only
libraries at the moment.

~~~
goerz
Yes you can. Kernels are completely independent from the notebook itself.
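(For reference, registering a separate Python 2 kernel alongside a Python 3
notebook server is done via ipykernel, roughly like this -- assuming a
`python2` interpreter is on your PATH:)

```shell
# Install ipykernel into the Python 2 environment...
python2 -m pip install ipykernel
# ...then register it as a kernel the notebook server can launch.
python2 -m ipykernel install --user
```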

------
bjt2n3904
It'd be really nice if they followed Python 3's lead and named it ipython3.
This is just going to confuse install processes if I now have to install
"ipython<6".

~~~
WorldMaker
That's not necessarily a Python thing but a Linux-distribution thing, or a
matter of whichever package manager or installer you are using. I like all my
Python 3+ installs just called python, personally.

~~~
Nullabillity
It _is_ a Python thing, see
[https://www.python.org/dev/peps/pep-0394/](https://www.python.org/dev/peps/pep-0394/).
And changing the definition of `python` is just going to cause issues again
once Python 4 eventually comes around.

------
708145_
Wow, I thought that this would never happen.

------
Chris2048
I personally think they should have just declared Python 3 to be a new
language, inspired by Python 2, like the many other competing new Pythons.

Instead, they didn't play fair, and gave themselves an unfair advantage.

~~~
scrollaway
Damn that Python team, giving themselves an unfair advantage in developing
Python.

Seriously, what am I reading here?

~~~
krick
Maybe this is a joke, and if so I'd say it's a pretty neat one. I mean, just
look at it: what other language has had a similar problem at that scale? And
let's not pretend it has all passed and we're happily moving forward: this
year (this month, actually) I've seen a very prominent NN course suggesting
Python 2.7 for all assignments and code samples, and a quite useful library
(also NN-related) supporting 2.7 only. I guess some of mitsuhiko's (Armin
Ronacher's) projects haven't moved to 3.x either.

So, yeah, IPython dropping 2.7 is pretty huge. Almost like moving to a
different language.

~~~
jerryszczerry
> I've seen a very prominent NN course suggesting Python 2.7 for all
> assignments and code samples, and a quite useful library (also NN-related)
> supporting 2.7 only.

Oh, that's plain silly. Isn't the whole point of Python 2.7 to facilitate the
transition to Python 3?

~~~
krick
Silly or not, it's still a very common mindset among scientifically oriented
people (who care about ML, econometrics, statistics and such much more than
"programming" per se), and they form quite a notable category of Python users.
Actually, for me they are the reason I'm still using Python a lot (although I
try to stick to 3.x), because for other use cases, like scripting or web dev,
I've mostly (but not completely) moved to other languages.

