It's changed the way I work (and blog)
You can combine multiple languages in the same notebook, and they can communicate.
Interesting tweet from one of the IPython/Jupyter devs last year about Beaker. Jupyter and Beaker state similar missions, although Beaker seems to have focused more on "multiple languages within a single notebook" leveraging the IPython backend. Sounds like the two projects can both co-exist due to the slightly different emphasis influencing each project's trajectory.
This switch is happening in IPython 3.0, and the continued lack of public information concerns me. It feels like Jupyter has lost steam.
Beaker has autotranslation for communicating among languages, and you can have multiple languages in one notebook.
Four months ago someone asked for a compare-and-contrast:
Since then we have fixed a ton of bugs and fleshed out the concept. The reflection API and JS scriptability are about to be released, along with a bunch of UI polish, performance, and all kinds of fixes.
Yes you can get better tables with IPy if you remember to load a lib and call a function. With Beaker it just works by default.
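For reference, the "load a lib and call a function" step in IPython usually means pulling in pandas (the library choice here is my assumption, not something Beaker or the comment specifies):

```python
# In IPython/Jupyter, rich tables take an explicit import and a constructor
# call; a bare list of dicts just prints as plain text.
import pandas as pd

data = [{"name": "alpha", "score": 0.91},
        {"name": "beta", "score": 0.87}]
df = pd.DataFrame(data)
df  # as the last expression in a cell, this renders as an HTML table
```

A plain `print(data)` in the same cell would give you the raw repr, which is the default-behavior gap the comment is pointing at.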
Ditto for sharing: Beaker has one-click sharing to the web built in; with IPy you have to load an extension.
On the Mac, Beaker comes packaged as a native app that you just drag to Applications and the Dock.
That said, I think Mathematica does a much better job of notebook style programming. You can do some truly amazing things manipulating the Mathematica notebook. The language itself is also pretty nice, something like APL flavored lisp with M-expressions instead of S-expressions. It isn't without its flaws, but it is one of my favorite tools in the toolbox (along with Python, C++, Haskell and Fortran).
Great for previewing graphs, copy-pasting, and executing things out of order.
Even with one pretty large output it works quite well. However, if you `print` each intermediate result and have a few hundred of those, you do get a problem; some kind of overflow protection is needed here.
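The overflow protection could be as simple as eliding the middle of a long run of results. A minimal sketch (the `summarize` helper and its cutoffs are entirely hypothetical, not a notebook feature):

```python
# Hypothetical guard against flooding a notebook with intermediate prints:
# show only the first and last few results and elide the rest.
def summarize(results, head=3, tail=3):
    if len(results) <= head + tail:
        return [str(r) for r in results]
    lines = [str(r) for r in results[:head]]
    lines.append(f"... ({len(results) - head - tail} results elided) ...")
    lines.extend(str(r) for r in results[-tail:])
    return lines

for line in summarize(list(range(500))):
    print(line)  # 7 lines instead of 500
```

Jupyter's own answer is output scrolling/clipping in the frontend, but a helper like this keeps the clutter out of the saved notebook too.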
The win with Python (and other dynamic languages) is that you can experiment quickly with ideas when you're formulating a solution; that's a big part of exploratory data science.
If you're curious about high-speed work in Python - Radim did a blog series on how he sped up word2vec to be faster than Google's original C code: http://radimrehurek.com/2013/09/deep-learning-with-word2vec-...
I'll also note [self promo!] that I wrote a book on High Performance Python, if that's your cup of tea (and Radim wrote a section in it): http://shop.oreilly.com/product/0636920028963.do
I cut the marketing speak down to minimum in my articles and tutorials, but if you're interested in cutting edge machine learning & no-nonsense data mining, get in touch! I run a world class consulting company, http://radimrehurek.com.
And in my experience, very hard to reproduce after a couple of years. With enough discipline, it's obviously possible to make well-structured Python programs that will last. But in practice that rarely happens with scientific software written in Python. Usually, there are many external dependencies, it's fragile (no static type checking), and platform-dependent (usually OS X or Linux). To add to the mess, most scientists like to hardcode paths to the input data, etc.
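The hardcoded-paths problem at least has a cheap fix: take the path from the command line or the environment with a documented default. A sketch (flag and variable names here are illustrative, not from any particular project):

```python
# One way to avoid hardcoding input paths in scientific scripts:
# accept them as arguments, falling back to an environment variable.
import argparse
import os
from pathlib import Path

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Reproducible analysis entry point")
    parser.add_argument(
        "--data-dir",
        type=Path,
        default=Path(os.environ.get("DATA_DIR", "data")),
        help="directory with the input files (default: $DATA_DIR or ./data)",
    )
    return parser.parse_args(argv)

args = parse_args(["--data-dir", "/tmp/experiment"])
print(args.data_dir)
```

It doesn't solve fragile dependencies or platform lock-in, but it does mean the next person can run the script without editing the source.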
Although I am not a fan of Java, I usually don't encounter the same problems with older scientific Java software. If it's Mavenized you are usually ready to go after a 'mvn compile', otherwise, you just dump the project structure in an IDE and it usually works.
(The plague with scientific software in Java is that it is often not thread-safe.)
Also, I think quick experimentation is not limited to Python; statically typed languages with a REPL (Haskell, OCaml, Scala) can provide it too. And since Go was mentioned: Go's compilation time is usually near-zero, so it's much the same.
Well, let's be honest with ourselves... this isn't limited to Python. Scientific code that isn't a mess is almost nonexistent. For a lot of scientists, writing code is totally secondary and many simply aren't skilled programmers (nor should we necessarily expect them to be).
It is however deeper than that. As a graduate student, I was involved in a government initiative to write a high quality large scale code package. This was (still is, the program just got extended) a well funded and well organized effort with hundreds of people, including literally dozens of people who can legitimately claim to be the best in the world at their specialties. This included some genuinely amazing computer scientists and software engineers who enforced well planned coding practices.
And yet, the code is still far from ideal. A big part of this is its scale - millions of lines of very technical numerics code and libraries all working together. Most of what I consider to be the toughest work was on integrating various disparate pieces and unifying them under one common input structure.
Point being, even with effectively unlimited resources, rigorous development standards, and statically typed languages (primarily C++11), there are still tons of issues. A lot of it comes from incorporating older codes, which is inescapable in any non-trivial scientific code.
I've really enjoyed this book so far, so thanks!
Secondly, since a lot of technical computing involves multidimensional arrays, you want good support for them in the language. That means some kind of array syntax, as in Matlab, R, or Python/NumPy, and efficient handling behind the scenes (one contiguous array instead of the nested arrays somewhat popular in C code).
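To make the array-syntax point concrete, here is the NumPy flavor of it: one contiguous 2-D array with whole-array expressions, instead of nested lists and explicit loops:

```python
# Nested Python lists are many small heap objects; a NumPy array is one
# contiguous block, and operations apply elementwise without Python loops.
import numpy as np

nested = [[1.0, 2.0], [3.0, 4.0]]   # list-of-lists
a = np.array(nested)                # single contiguous 2-D array

b = a * 2 + 1                       # whole-array expression, no loop
col_means = a.mean(axis=0)          # reduce down each column

print(b.tolist())          # [[3.0, 5.0], [7.0, 9.0]]
print(col_means.tolist())  # [2.0, 3.0]
```

The same computation over `nested` would need two nested `for` loops per expression, which is both slower and noisier to read.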
So in the end, there's not a whole lot to choose from if you're not willing to sacrifice either of the two features above. One language I'm excited about, Julia, is a bit special in that it tries to combine the high productivity of such high-level interactive environments with C/Fortran-like performance. The language itself is really nice, IMHO, but of course the surrounding ecosystem is so far much less mature than that around scientific Python.
That being said, I'm also excited about Rust and I hope it will have a bright future, also in technical computing. Though I believe where Rust would be most useful, compared to Julia, say, is for writing low level libraries that can then be used from any language with a C FFI, as Rust doesn't require a big runtime with GC and whatnot.