The source code of the packages sit on my computer, I can (and sometimes do) modify it, and I can still compile the same function to SIMD or GPU.
It should also be possible with Node.js I think (node_modules).
Actually translating Ruby code to Julia was much easier than I thought (the only real difference is the indexing).
The 1-based indexing really sucks in the PTX assembly output of Julia as well: I see a lot of useless increment and decrement operations where I don't expect them.
Like, when I first saw how you work with matrices in Python, I had a stroke.
I don't really get why we use Python for everything. The MATLAB-style syntax is infinitely better for ML, stats and math.
Syntax is a secondary concern for all but short-term, small-scale programming.
When it comes to making scalable software systems (scalable in every sense of the term, not just performance but also feature growth etc.), syntax is really not the biggest concern. As such, optimizing for syntax can be detrimental to other, more important aspects (e.g. composability, reusability, programmability), and focusing on syntax means you're not paying attention to the really critical stuff.
I mostly work in Clojure where _all_ operations use prefix syntax (e.g. you write (+ A B C) instead of (A + B + C) for addition, and + is just a function). Trust me, it's unfamiliar at first, but it makes reasoning about the program MUCH saner than all the special cases and irregularities that language designers bake into the language just to make some operations infix.
In Haskell, you can turn any function into an infix operator with backquotes. As a qualitative, nonfunctional, stylistic, ergonomic opinion, I find Haskell much more flexible and beautiful than the ugliness of most brace-, paren- and prolixity-heavy languages. It can present too many features for pragmatic use, in contrast to languages in another domain like Rust or Go, which aim at effective software production by intentionally constraining or disallowing features.
Actually, Lisp uses ordinary old Polish notation. Reverse Polish Notation is used by stack-oriented systems like HP calculators and Forth.
(- (+ 1 2 3 4 5) 1)
- + + + + 1 2 3 4 5 1
The first one seems much more readable to me...
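To make the comparison concrete, here is a minimal sketch (in Python, purely illustrative; the function names are mine) of evaluators for both notations. It shows that the parenthesized Lisp form with variadic operators and the flat Polish form with binary operators denote the same computation:

```python
from functools import reduce
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_lisp(expr):
    """Evaluate a nested (op arg1 arg2 ...) form; operators are variadic."""
    if isinstance(expr, (int, float)):
        return expr
    op, *args = expr
    return reduce(OPS[op], (eval_lisp(a) for a in args))

def eval_polish(tokens):
    """Evaluate flat Polish notation; operators are strictly binary."""
    def helper(it):
        tok = next(it)
        if tok in OPS:
            left = helper(it)          # operands follow the operator,
            right = helper(it)         # so recursion needs no parentheses
            return OPS[tok](left, right)
        return tok
    return helper(iter(tokens))

# (- (+ 1 2 3 4 5) 1)
lisp_form = ["-", ["+", 1, 2, 3, 4, 5], 1]
# - + + + + 1 2 3 4 5 1
polish_form = ["-", "+", "+", "+", "+", 1, 2, 3, 4, 5, 1]

print(eval_lisp(lisp_form))      # 14
print(eval_polish(polish_form))  # 14
```

Note that the flat form needs no parentheses only because every operator has a fixed arity; Lisp's parentheses are what buy it variadic + in the first place.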
And parentheses don’t serve precisely the same role in Lisp syntax as they do in the more free-form world of maths.
One line of code is enough to get something that works:
(defmacro infix [a op b] `(~op ~a ~b)) ; (infix 1 + 2) ;=> 3
There are libraries for that too. Clojure: https://github.com/rm-hull/infix
But 99% of statistical applications I have seen differ.
For someone using math, "programmability" and the other aspects you mention are certainly higher in a language that emulates how we think about the problems. While Clojure may indeed work well for some areas in math, Julia is certainly appropriate for statistics and probability (in my opinion the best), and Python isn't very good for either.
Everything else is almost always secondary. I would also argue that Julia is absolutely fine for developing large, scalable programs, not worse than Python.
Since Clojure has no statistics/ML ecosystem afaik, I am not sure why you mention it. An ecosystem is absolutely crucial for a language to even be considered, which is my point. That is why R still dominates statistics, by a large margin.
I mentioned it to illustrate my point, that semantics and other aspects matter much more than syntax in many cases.
(Btw Clojure does have a ML / stats ecosystem, in part via the JVM, though certainly not as developed as Python / Julia / R. For instance, Anglican is a probabilistic programming language embedded in Clojure: https://github.com/probprog/anglican).
> For someone using math, "programmability" and the other aspects you mention are certainly higher in a language that emulates how we think about the problems.
Sure, but syntax is just about notation - semantics are much more important to achieving nearness to a mental model. If you can't express your mental model in another notation than the one you're familiar with, then you probably don't have a very deep understanding of said model.
Continuing with my example, prefix notation for matrix multiplication does not hurt _at all_ my ability to reason about linear algebra - it sometimes even clarifies it.
I also think you misinterpret what I meant by programmability, which is not the same thing as 'ease of programming' - more like how smoothly various parts of a program interact. If for the sake of syntactic sugar you've introduced a proliferation of different programming constructs with no unifying abstraction, then the other parts of the program will need to make a proliferation of case distinctions as well - that's one way to hurt programmability.
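To illustrate the programmability point with a deliberately small sketch (in Python; the function names are hypothetical): when constructs share no unifying abstraction, every consumer must enumerate cases, whereas a single shared protocol lets the cases collapse.

```python
# Without a unifying abstraction, every caller branches on concrete types:
def total_v1(xs):
    if isinstance(xs, list):
        return sum(xs)
    if isinstance(xs, tuple):
        return sum(xs)
    if isinstance(xs, set):
        return sum(xs)
    raise TypeError("unsupported collection: " + type(xs).__name__)

# With one unifying abstraction (here, "anything iterable"), the case
# distinctions disappear, and new types work without touching this code:
def total_v2(xs):
    return sum(xs)

print(total_v2([1, 2, 3]), total_v2((1, 2, 3)), total_v2(range(4)))  # 6 6 6
```

Syntactic sugar that introduces a new special-case construct pushes a `total_v1`-style branch into every part of the program that touches it; a unifying abstraction is what keeps the rest of the codebase looking like `total_v2`.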
Clojure's ML/stats ecosystem is moving fast. Several important libraries are under construction and will mature in a few months. Imho, it is worth following this year, for anyone interested in languages for ML/stats.
In addition to probabilistic programming libraries such as Metaprob and Anglican mentioned above, here are some libraries worth mentioning:
Anglican is implemented in Clojure, and can be extended (by writing new Clojure code) to support new general-purpose inference engines. Creating those extensions requires an understanding of both the statistics and the PL concepts used in Anglican’s backend; you are essentially writing a new interpreter for arbitrary Anglican code.
Gen provides high level abstractions for writing custom inference algorithms for _specific models/problems_ (not entire general-purpose inference engines). Those abstractions don’t require reasoning about PL concepts like continuation-passing style, nor do they require the user to do any math by hand. Of course, since Gen is just Julia code, you can still reach in and implement new inference engines (just as in Anglican/Clojure) if you’re an expert. But I wouldn’t expect people who are not probabilistic programming researchers to do this (in either Anglican or Gen).
As an example, I know of a Monte Carlo nuclear reactor simulator that was written in tens of millions of lines of Fortran over a period of 3+ decades by nuclear engineers who were not software engineers.
Also, have you used an HP 48 calculator? You can snag an emulator app and a ROM to use it on most smartphones, and it does integration and differentiation.
You can't dodge knowledge with frameworks; you can only use frameworks to put your knowledge to work more effectively.
(e.g.: text books, practical applications, introductory articles)
The Design and Implementation of Probabilistic Programming Languages (https://dippl.org) by Noah D. Goodman
Stanford CS 228: Probabilistic Graphical Models
https://cs228.stanford.edu and book by Daphne Koller http://openclassroom.stanford.edu/MainFolder/CoursePage.php?...
ProbTorch: Library for deep generative models that extends PyTorch https://github.com/probtorch/probtorch
Anglican: Probabilistic programming language integrated with Clojure and ClojureScript https://probprog.github.io/anglican/index.html
Ranked programming is like probabilistic programming but you don't use probabilities. Instead, you state how your program normally behaves and how it may exceptionally behave. Conceptually it's very similar to probabilistic programming, but the underlying uncertainty formalism is replaced with ranking theory.
You can find an implementation of this idea (based on Scheme/Racket) here:
For more detailed information check the paper linked to on that page.
- Probabilistic Models of Cognition https://probmods.org/ by Noah D. Goodman, Joshua B. Tenenbaum & contributors
- An Introduction to Probabilistic Programming https://arxiv.org/abs/1809.10756 by Jan-Willem van de Meent, Brooks Paige, Hongseok Yang, Frank Wood
Right now people solve the problem with frameworks like Keras, so that programmers have an easier time expressing themselves (for example, when designing neural networks). Imagine a new programming language in which it is much clearer and easier to create a neural network (for deep learning).
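For contrast, here is a minimal, purely illustrative sketch in plain Python of the forward pass of one fully connected layer, the kind of plumbing that a framework like Keras hides behind a declarative layer API (all names here are mine, not Keras's):

```python
import random

def dense(inputs, weights, biases, activation):
    """One fully connected layer: y_i = activation(sum_j W[i][j] * x[j] + b[i])."""
    return [activation(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def relu(v):
    return max(0.0, v)

random.seed(0)
x = [1.0, 2.0]                                                # 2 input features
W1 = [[random.uniform(-1, 1) for _ in x] for _ in range(3)]   # 2 -> 3 weights
b1 = [0.0, 0.0, 0.0]

h = dense(x, W1, b1, relu)
print(len(h))  # 3 hidden activations
```

In Keras the whole layer above is roughly one declarative line; the point of a language designed for this domain would be to make such declarations first-class rather than a library convention.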
I'd personally at a minimum try out Julia if I was doing data science often. Don't know enough about Nim yet.
(is this article paywalled? I'm on my university's network so I don't have a paywall)
The syntax doesn't seem to have changed compared to Julia (so is it actually a new language, or just a library? Not sure; I haven't seen Julia code in a while.)
And the semantics are the same, with some additions. This is a good thing, as it allows interop with Julia code, keeps mental overhead low, and shows the power of Julia :)
I wonder if the HN crowd thinks this system is AI?
It seems to work to get exposure for work being done there.
A "new language" versus "a DSL built on Julia".
"Swift for Tensorflow" is still Swift.
I wonder why.
“Building off concepts used in their earlier probabilistic-programming system, Church, the researchers incorporate several custom modeling languages into Julia, a general-purpose programming language that was also developed at MIT.”
What they really mean is "new probabilistic graphical model language." Yet another BUGS/JAGS/Stan like system.
"New AI programming language goes beyond deep learning General-purpose language works for computer vision, robotics, statistics, and more"
I'm pretty sure most sane people agree this is a giant pile of meaningless marketing horse pookey, which is par for the course with MIT these days. That's what I was responding to. And characterizing it as "like stan" is a lot closer to descriptive than anything in that article. I picture whoever writes this stuff for MIT as wearing tap out t-shirts with neck tattoos.
"Then, install the Gen package with the Julia package manager. From the Julia REPL, type ] to enter the Pkg REPL mode and then run: pkg> add https://github.com/probcomp/Gen"
Edit: https://medium.com/tensorflow/an-introduction-to-probabilist... There is support in tensorflow for probabilistic programming. How is this any different?