This does strike me as an interesting perspective though:
> a bedrock of optimized and domain-aware numerical libraries written in a combination of Fortran, C, C++, handwritten assembly and some HDLs
So, we're talking lower level, from the kernel and above but below TensorFlow, Keras, etc., I take it?
If yes, we're into systems territory by that point; it just so happens that this is Go's domain. Think of shrinking a 1000-brains, 10-year cycle to a 100-brains, 5-year one or less (you can actually DevOps/CI-CD and sprint that stuff with Go like you would with most high-level languages like Python or JS; a general "no-can-do" with low-level concurrent C++).
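To make the "sprint that stuff with Go" claim a bit more concrete, here is a toy, stdlib-only sketch of the concurrency ergonomics being alluded to (`parallelMap` and `square` are my own illustrative names, not from any library):

```go
package main

import (
	"fmt"
	"sync"
)

// square is a stand-in for any CPU-bound kernel you might fan out.
func square(n int) int { return n * n }

// parallelMap fans work out over a fixed pool of goroutines and
// collects results. The whole pattern is a few lines of stdlib Go;
// the equivalent in low-level concurrent C++ typically means threads,
// mutexes, and careful lifetime management.
func parallelMap(in []int, workers int, f func(int) int) []int {
	out := make([]int, len(in))
	jobs := make(chan int)
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := range jobs {
				out[i] = f(in[i]) // each worker writes a distinct index
			}
		}()
	}
	for i := range in {
		jobs <- i
	}
	close(jobs)
	wg.Wait()
	return out
}

func main() {
	fmt.Println(parallelMap([]int{1, 2, 3, 4}, 4, square)) // [1 4 9 16]
}
```

The point isn't that this is fast numerics; it's that spinning up concurrent pipelines like this is routine Go, which is what makes the CI/CD-style iteration plausible.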
I may be wrong or totally out of my depth here, but I'm speaking of the library layer where, e.g. in compute, you'd see a CUDA-based industry versus whatever else (OpenCL...) take about a decade to unfold (and as much to roll back); same in graphics, where you'd see DirectX/OpenGL/Vulkan market penetration play out over the entire lifetime of a GPU architecture (not a 'gen', the core design, e.g. GCN).
I'm lacking experience with ML workflows for sure; however, the general hardware cycles and industrial market profiling seem to hold there as well (from GPU/TPU/FPGA/whatever fabrication, and the economics thereof, up to high-level software libraries and the 'engineering muscle' that proponents of tech XYZ can push beyond marketing).
Where Go fits in that landscape is not so much versus a data scientist's Python but rather against infrastructure/Ops people's C++. Go is currently a strong candidate to refactor (parts of) the COBOL space, for instance, because development in it is basically 10x faster than in C++ or Java.
I'm thinking out loud. Don't quote me on any of this!
Tensorflow/Keras are frameworks built on:
> a bedrock of optimized and domain-aware numerical libraries written in a combination of...
You are not making any calculations in Python (if you do, you are mostly using native numerical libraries via FFI); you only set up your workflow, prepare data, etc. The heavy lifting is done in the low-level libraries used by the ML frameworks.
That's why Go will not make it drastically more performant/better. Additionally, most people doing ML know Python; almost none know Go. 99% of tutorials, ML libraries, etc. are in Python. There are not enough benefits to leave such a rich ecosystem in favor of Go. I write a lot of Go, but even I would prefer Python over Go for any ML work.
I can give you one reason: most of the ML tutorials online work on toy problems. Full stop. When it comes time to deploy the models, good luck; you need to move heaven and earth with DevOps stuff, Dockerize all your programs, add more heft.
Not so when using Go directly. There's a reason why data engineers are more in demand now than data scientists. With scikit-learn, Keras, etc. it's easy to build models. It's not easy to deploy models to production. Half the tutorials don't teach the important bit: that your model needs to live in production.
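As one hedged illustration of that deployment point: a stdlib-only Go scorer that `go build` turns into a single static binary, no Docker gymnastics required. The model and weights here are made up for the sketch; a real service would load a trained model:

```go
package main

import (
	"encoding/json"
	"fmt"
	"math"
	"net/http"
	"net/http/httptest"
	"strings"
)

// predict is a hardcoded logistic scorer standing in for a real
// trained model; the weights and bias are invented for illustration.
func predict(features []float64) float64 {
	weights := []float64{0.8, -0.4}
	z := 0.1 // bias
	for i, w := range weights {
		z += w * features[i]
	}
	return 1 / (1 + math.Exp(-z))
}

// handler exposes the model over HTTP as a JSON endpoint.
func handler(w http.ResponseWriter, r *http.Request) {
	var features []float64
	if err := json.NewDecoder(r.Body).Decode(&features); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	json.NewEncoder(w).Encode(map[string]float64{"score": predict(features)})
}

func main() {
	// In production: http.ListenAndServe(":8080", http.HandlerFunc(handler))
	// Here we exercise the handler in-process so the example terminates.
	srv := httptest.NewServer(http.HandlerFunc(handler))
	defer srv.Close()
	resp, err := http.Post(srv.URL, "application/json", strings.NewReader("[1, 1]"))
	if err != nil {
		panic(err)
	}
	var out map[string]float64
	json.NewDecoder(resp.Body).Decode(&out)
	fmt.Printf("score: %.3f\n", out["score"])
}
```

The whole thing deploys as one self-contained executable, which is exactly the ops story the Python stack struggles to match.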
Now, if you write your programs using Gonum or Gorgonia, you need to think a lot more deeply about what your model is doing, about memory, about the things that software engineers think about. It's not easier, but it's the only sustainable way forward.
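A stdlib-only sketch of that memory-conscious style. This is not Gonum's actual API (the function below is my own), though Gonum's `mat` package uses a similar caller-supplied-destination convention to keep allocations out of hot loops:

```go
package main

import "fmt"

// matVec computes dst = m · v for a row-major (rows×cols) matrix,
// writing into a caller-supplied dst so repeated calls allocate
// nothing. Deciding who owns and reuses buffers is exactly the kind
// of thinking the framework layer normally hides from you.
func matVec(dst, m, v []float64, rows, cols int) {
	for r := 0; r < rows; r++ {
		s := 0.0
		for c := 0; c < cols; c++ {
			s += m[r*cols+c] * v[c]
		}
		dst[r] = s
	}
}

func main() {
	m := []float64{1, 2, 3, 4} // 2×2, row-major
	v := []float64{1, 1}
	dst := make([]float64, 2) // allocated once, reused across calls
	matVec(dst, m, v, 2, 2)
	fmt.Println(dst) // [3 7]
}
```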
And most libraries and implementations are also in Python because they work on toy problems?
> There's a reason why data engineers are more in demand now than data scientists. With Scikit learn, keras etc it's easy to build models. It's not easy to deploy models to production. Half the tutorials don't teach the important bits: that your model needs to live in production.
There is also a reason why you employ a full-stack web developer instead of a frontend developer. I can tell you the reason: it's cheaper than employing both a frontend developer and a backend developer. And for toy problems you can hire a full-stack developer.
> Now, if you write your programs using Gonum or Gorgonia, you need to think a lot deeper about what your model is doing, about memory about things that software engineers think about. It's not easier, but it's the only sustainable way forwards.
So you are implying that the whole industry, researchers, and ML practitioners got it wrong and they should use Go now?
I know a lot of people working on ML related problems and none of them use Go for their ML work. Some of them have Go in their stacks, sure, but it's not used for ML directly. And they solve practical business problems.
I've also worked before with two ML researchers respected in the industry, and I can assure you they are not working on toy problems and they do not know Go. And this is coming from a Go user and enthusiast. Programming languages are just tools, not a religion.
Programming languages are tools indeed. Some tools make life easy in one way, some tools make life easy in other ways.
Exactly, you always choose between different sets of trade-offs.
Machine learning code isn't going to run just anywhere.
And coming soon, Gorgonia on TPUs. Originally planned for this year, it seems to be next year now owing to a timetable change on the TPU people's side.
> The primary goal is to make calling the CUDA API as comfortable as calling Go functions or methods. Additional convenience functions and methods are also created in this package in the pursuit of that goal.
Great angle. Very idiomatic to Go, and ballpark what most developers wish for. Polishing the convenience aspect (until some 'promise of stability' at 1.0) is truly what makes some projects popular, imho. UX (well, DX, for Developer eXperience) is a clear determining factor, especially in the age of open source and generalized `git` pushes to 'the cloud'.
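A hedged sketch of the wrapping idiom being praised, with mocks standing in for the actual package's types (`status`, `mallocDevice`, and `result` here are all illustrative names, not the real CUDA binding API):

```go
package main

import "fmt"

// status mimics a C-style result code of the kind CUDA's API returns.
type status int

const success status = 0

// result converts a status code into an idiomatic Go error, so callers
// write `if err := ...; err != nil` instead of checking magic integers.
// This is the core move in making a C API feel like calling Go functions.
func result(s status) error {
	if s == success {
		return nil
	}
	return fmt.Errorf("cuda error %d", int(s))
}

// mallocDevice stands in for a wrapped C call returning a status code.
func mallocDevice(bytes int) status {
	if bytes <= 0 {
		return 1 // mock "invalid value" error code
	}
	return success
}

func main() {
	if err := result(mallocDevice(1024)); err != nil {
		fmt.Println("alloc failed:", err)
		return
	}
	fmt.Println("alloc ok")
}
```

Small as it looks, this kind of shim is most of what "comfortable as calling Go functions" means in practice: errors compose with the rest of the codebase instead of living in a parallel status-code world.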
Might make things a bit clearer on the reasons why certain things are done in a certain way. It's by no means THE only way, but I hope I have listed my reasons clearly.
Python is great anyway; a language's intrinsic quality has nothing to do with whatever else the programming scene does. In my view it takes many languages to make a better software world, and while we don't want 250 (...), a couple dozen on a normal distribution seems quite adequate.