
Ask HN: Anyone have experience with Radeon deep learning? - alando46
https://instinct.radeon.com/en/6-deep-learning-projects-amd-radeon-instinct/

https://rocm.github.io/

Despite the ridiculous price point, I enjoy the portability and battery life of MacBooks. Almost all of my ML projects are on AWS or Linode or something, but if I were to even consider forking over for a MacBook Pro w/ a GPU, some sort of GPU ML support would be a must.

Anyone have experience with Radeon deep learning libraries? Thoughts?

(not trying to have a debate about the obvious impracticalities of mac laptops – just curious if anyone has done ML w/ Radeon libraries)
======
wizzerking
Maybe you found these StackOverflow Threads and Websites already
[https://stackoverflow.com/questions/37892784/using-keras-ten...](https://stackoverflow.com/questions/37892784/using-keras-tensorflow-with-amd-gpu/40814520)

[https://en.wikipedia.org/wiki/Comparison_of_deep_learning_so...](https://en.wikipedia.org/wiki/Comparison_of_deep_learning_software)

[https://gpuopen.com/compute-product/miopen/](https://gpuopen.com/compute-product/miopen/)

From reading these articles, the consensus is that if you need speed now and
can't wait, use an NVIDIA GPU while AMD plays catch-up. Who knows how long this
situation will last.
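
If you do want to try Keras on a Radeon GPU today, one route that comes up in
threads like the StackOverflow one above is PlaidML, an OpenCL-based Keras
backend that runs on AMD hardware. A minimal setup sketch (assuming Python and
pip are installed; package and backend names are PlaidML's, and `train.py`
stands in for your own script):

```shell
# Install the PlaidML Keras backend (OpenCL, works on many Radeon GPUs)
pip install plaidml-keras

# Interactive one-time setup: lists detected OpenCL devices and lets you
# pick your Radeon card as the default accelerator
plaidml-setup

# Run an existing Keras script against the PlaidML backend by overriding
# the backend before Keras is imported ("train.py" is a placeholder)
KERAS_BACKEND=plaidml.keras.backend python train.py
```

This only covers the Keras API surface, so anything using raw TensorFlow ops
won't work; for that you'd be looking at ROCm/MIOpen, which is exactly the
still-maturing stack the links above describe.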

