I love PyTorch for tinkering and experimenting.
In my experience, there's very little 'impedance mismatch' with PyTorch, meaning the framework rarely gets in my way. I never find myself 'wrestling' with the API. I expect this is only going to get better now that one of the project's explicit goals is to match numpy's API and semantics as much as possible over time.
Congratulations to the PyTorch community. You guys have done a great job!
> x = torch.rand(4, 3)
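To illustrate the numpy-like feel the parent comment describes, a few lines around that quoted snippet (standard torch calls only; the numpy equivalents in the comments are for comparison):

```python
import torch

x = torch.rand(4, 3)   # uniform random tensor, like np.random.rand(4, 3)
y = x.sum(dim=0)       # reduce along the first axis, like np.sum(x, axis=0)
z = x @ x.t()          # matrix multiply, like numpy's @ operator

print(x.shape, y.shape, z.shape)
```

The indexing, broadcasting, and reduction semantics largely carry over from numpy, which is why there is so little "impedance mismatch."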
The other frameworks are using (in my opinion) slightly misleading language as to how "ready" it is. ONNX to Caffe2 is Facebook's use case, so I could see that being supported fine. It's hard to go much beyond that, though.
Support for ONNX will be bottlenecked by what PyTorch can export right now. The file format just hit 1.0, but it will take some time for the ecosystem around it (including the export) to mature.
Disclaimer: I am a framework vendor who has spent the last few months messing with it for end users, writing model import for TensorFlow's file format as well.
ONNX to Caffe2*. Though it's possible to go from Caffe2 to ONNX to some custom hardware accelerator (TensorRT?)
And it will be possible, but again, support isn't actually implemented by anyone yet.
They just "signed on".
TensorFlow is the only supported format right now for TensorRT.
Maybe I'll find some use for it in my sysadmin world.
Thanks for such a summary.
Seriously though, we're in the early days of ML and still understanding the correct patterns and abstractions.
Web dev has been through this, to the point that we had "JS fatigue", until things started settling on the reactive model and React / Vue / Angular 2.
In the early days of "differentiable programming" perhaps. Certainly not ML.
Attention is an exceedingly precious commodity, potentially the most valuable one in the marketplace. Skilled developers must learn how to direct it so that it stops being commandeered by posers all the time.
For embedded devices you may not have access to Python. You can precompile the graph to the target device in such cases.
Note that in practice, PyTorch is as fast as or faster than TensorFlow, and newer versions allow you to "post compute" the graph and export it to ONNX, enabling embedded inference using Caffe2.
A relevant essay: "On Machine Learning and Programming Languages"
In the maximally abstract sense, Python isn't necessarily the best choice for this, but since the closest manifestation of the correct way to do this that I know of is probably Haskell, it seems unlikely to beat out Python any time soon. There are certainly a lot of worse languages.
Ultimately, you'll want a language that has the necessary primitives built in natively, but it doesn't sound like we're to that level of maturity in the field yet.
Chainer is also bogged down by the fact that it is made and maintained by a Japanese company, and its contributors are mostly Japanese.
Most of the community are Japanese users, and many model implementations are made by them, hence blogged about and documented in Japanese.
Chainer had a fate similar to Ruby's. The language barrier is real.
Can you give an example? What's not smooth?
I've been using PyTorch for quite a while, and had to use TensorFlow recently. Maybe it's my PyTorch _priors_, but using TensorFlow felt weird and unintuitive. Any good resources for PyTorch folks to get into TensorFlow?
But congrats on the hard work! I've been planning to try it out for a while now and the amount of resources and docs is great.
- 5.1 vs 5.2 vs LuaJIT. This is worse than Python 2 vs 3
- Lack of a universal standard for how to do OOP, etc. Python has PEP 8.
I can cope with 1-based indexing (think of it like using Matlab), but the other problems are holding me back.
The changes between Lua versions are nothing like the abysmal difference between Python 2 and Python 3. Not only is 5.1 code very likely to run on 5.3 and LuaJIT without issues, but if there are ever any issues, they are always very minor and solvable by requiring a simple compat library.
P.S. I did not downvote you. Somebody else did
luarocks install torch
Error: No results matching query were found.
/opt/torch/extra/cutorch/lib/THC/generic/THCTensorMath.cu(393): error: more than one operator "==" matches these operands:
PyTorch helpfully provides clear installation instructions for each platform and package manager at http://pytorch.org/, and the team has consistently been careful to ensure they just work.
For example, for Python 2.7 / pip / OS X:
pip install http://download.pytorch.org/whl/torch-0.3.0.post4-cp27-none-...
pip install torchvision