
Julia is a good language for computational simulation experiments, but in a world that's heading towards heterogeneous compute enabled by GPUs, TPUs, and intermediate-representation languages/frameworks like ONNX, TVM [1] and TensorFlow, it's difficult to take the claim of "Language for Modern ML" seriously.

If you disagree, please show me how many ML researchers/labs/companies use Julia over Python/C++. It's cool to claim "Modern", "Deep", "ML", but I don't see any evidence.

[1] http://tvmlang.org/2017/08/17/tvm-release-announcement.html



What's strange to me is that as people watch this transition in ML technologies, they can see and accept what's right in front of them but can't see what's coming next. There's growing evidence that ML/AI is a programming language problem – that's why all kinds of new IRs are being created for ML frameworks, as you say. But these are never going to be good to program in directly, so the next obvious step is to put a really nice, general-purpose surface syntax on them. Once you start thinking about what that surface syntax should be like, you quickly end up with something very similar to Julia. People don't seem to be connecting the dots just yet.

In some sense ML people are working from the bottom up while Julia is working from the top down. ML/AI frameworks are starting to put better IRs on top of their low-level codegen. They haven't gotten to the surface syntax part yet because they're just starting on the IR – but they will, because that's the next logical step. Julia, on the other hand, is working from the top down, starting from a really nice surface syntax that's excellent at codegen. First it targeted CPUs, now GPUs, and in the future TPUs, Nervana chips, FPGAs, etc. It's already possible to target all kinds of different hardware with the same productive, generic high-level code, as sketched below. Which approach do you think is going to end up with a better, more productive developer experience in the long run?
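
To make "the same generic code on different hardware" concrete, here's a minimal sketch in Julia, assuming the CuArrays package from the GPU ecosystem (the package name and details are illustrative, not a pinned API):

    using CuArrays   # GPU array package; name is illustrative

    # One generic function, with no hardware-specific code in it:
    axpy!(a, x, y) = (y .= a .* x .+ y)

    x = rand(Float32, 10^6); y = rand(Float32, 10^6)
    axpy!(2f0, x, y)                   # runs as fused loops on the CPU

    xg, yg = CuArray(x), CuArray(y)
    axpy!(2f0, xg, yg)                 # same code: the broadcast compiles to a GPU kernel

The point isn't this particular function; it's that nothing in the source mentions the device – dispatch on the array type picks the backend.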


Julia is growing a really good GPU support ecosystem.

The advantage, then, is having your whole codebase in a single language rather than a two-language solution (e.g. Python + C++).
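
As a rough sketch of what "single language" buys you (assuming the CUDAnative and CuArrays packages; the launch-macro syntax has varied between releases), even a custom kernel can stay in Julia, with no C++ side to build and bind:

    using CUDAnative, CuArrays   # illustrative package names

    function vadd!(c, a, b)
        i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
        if i <= length(c)
            @inbounds c[i] = a[i] + b[i]
        end
        return nothing
    end

    a = CuArray(rand(Float32, 1024)); b = CuArray(rand(Float32, 1024)); c = similar(a)
    @cuda threads=256 blocks=4 vadd!(c, a, b)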


We live in an age of vendor-provided, vendor-optimized libraries like cuDNN; for 99.9% of ML applications you won't need to write custom CUDA code. Further, as I mentioned earlier, with intermediate-representation frameworks like TVM you get well-optimized underlying constructs across different computing architectures.

Julia might still be a good enough MATLAB replacement for computational-simulation-style tasks, but it's clearly not suited for machine learning.


Neither is Python, but people still use it.


Julia, despite a pretty small userbase, has some great ML libraries. See for example Knet:

https://github.com/denizyuret/Knet.jl
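
For a flavor, here's a linear-regression sketch in the style of the Knet README, on synthetic data (the exact API may differ between Knet versions):

    using Knet

    predict(w, x) = w[1] * x .+ w[2]
    loss(w, x, y) = sum(abs2, y .- predict(w, x)) / length(y)
    lossgradient = grad(loss)    # grad comes from AutoGrad, re-exported by Knet

    x, y = randn(Float32, 13, 100), randn(Float32, 1, 100)   # fake data
    w = Any[0.1f0 * randn(Float32, 1, 13), 0.0f0]
    for epoch in 1:20
        dw = lossgradient(w, x, y)
        for i in 1:length(w)
            w[i] -= 0.1f0 * dw[i]
        end
    end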


You are right, but with Python you usually don't have to touch C++, because someone else already took care of that part.


That's true when you're lucky. But if you're someone who does package development and methods research, you're usually not lucky. This is why pretty much all of the most widely used Python packages have lots of C++ and Fortran in them. As someone who develops algorithms rather than just using packages, I found this hampered my productivity in Python, whereas making Julia packages "production-quality" is straightforward, so even with the smaller userbase I think it's worthwhile to develop in Julia instead.
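
As a small illustration of why the rewrite step tends to disappear (a sketch, not a benchmark): a plain Julia loop is compiled and specialized per element type, so the prototype usually already is the "production" hot path:

    # Naive Julia loop; the compiler specializes it for each element type,
    # so there's no separate C/Fortran version to write later.
    function mysum(xs::AbstractVector{T}) where {T<:Number}
        s = zero(T)
        @inbounds for x in xs
            s += x
        end
        return s
    end

    mysum(rand(10^6))           # specialized for Float64
    mysum(rand(Int32, 10^6))    # same source, recompiled for Int32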


Julia marginally makes sense in some academic environments. Outside of that? Not really.



