
"installed standard Ubuntu 14.04 w/ Caffe, Torch, Theano, BIDMach, cuDNN v2, and CUDA 7.0"

Whoa - are you telling me that the nVidia drivers on Linux are so stable that they're building a commercial deep learning system on top of them? Is this the same thing as the normal graphics drivers?




The difference is between "given this particular hardware and OS setup, the driver will work correctly, guaranteed" vs. "on your (discontinued) Sony laptop with a strange hardware interface, running a beta Slackware release, the driver will probably work".


It's also the difference between "using our tools, implementing our API, on our hardware" vs. "trying to figure out the right thing when every component has a slightly different take on the API spec, and the applications using the API make mistakes that we have to try to correct for with unreliable heuristics". Developers in the scientific computing world care about correctness a lot more than game developers working under unrealistic deadlines and with no commitment to long-term maintenance.


That doesn't make sense to me. If the driver works and does not crash the operating system (kernel panic FTW), that's good enough at this point.

nVidia uses CUDA or OpenGL - so it's not quite a question of a proprietary API.

At this point, I'm not worried about "framerate on my Linux box isn't as good as on Windows"... it's more "it works".


An OpenGL driver capable of running most commercial games is horrifically more complex than a CUDA driver. An OpenGL driver that merely works according to spec is useless in practice. To achieve any practicality for non-trivial use cases, an OpenGL driver has to take an attitude of "do what I mean, not what I say", much like web browsers and Windows' backwards compatibility. CUDA doesn't have those problems. NVidia never has to deal with developers complaining that their broken code worked fine on some other vendor's platform. They don't have to worry about programs relying on some esoteric decades-old feature that NVidia doesn't care about but had to implement anyway for standards compliance. And since CUDA is operating in the professional segment of the market, they can take their time when it comes to compatibility with bleeding-edge versions of other OS components.


Also, a compute API makes a lot more sense than OpenGL, with an understandable mapping to actual GPU resources.
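To illustrate what I mean (a toy example of my own, not from the article): in CUDA, the grid/block/thread hierarchy you program against maps pretty directly onto how work actually lands on the hardware, with no fixed-function pipeline in between.

    // Minimal sketch: each thread computes its own index and handles one element.
    #include <stdio.h>
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n) {
        // This index arithmetic is essentially the whole resource model
        // the programmer has to understand.
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    int main(void) {
        const int n = 1 << 20;
        float *d;
        cudaMalloc(&d, n * sizeof(float));
        cudaMemset(d, 0, n * sizeof(float));

        // 256 threads per block is a typical choice; the blocks get scheduled
        // across however many SMs the device actually has.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        scale<<<blocks, threads>>>(d, 2.0f, n);
        cudaDeviceSynchronize();

        cudaFree(d);
        printf("done\n");
        return 0;
    }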


AFAIK Linux with Nvidia GPUs is used in plenty of supercomputers and Hollywood special effects companies, so the drivers must be stable enough.


In Nvidia's deep learning lab last week [0] they described setting up on Ubuntu as "the easy way" and setting up on Windows as "the not-so-easy way" [1], adding that their developers are on Linux, so no one has really tested it on Windows.

[0] https://developer.nvidia.com/deep-learning-courses

[1] http://on-demand.gputechconf.com/gtc/2015/webinar/deep-learn...


Nvidia's drivers for Linux are pretty darn good these days.


The laptop ones are pretty shitty; we're still using third-party programs to support Optimus technology (with less-than-great results).


I just wasted two days trying to get Bumblebee to work on Debian with my Optimus laptop. :( Ultimately I had to switch to Ubuntu and the Nvidia drivers (which apparently default to 'Prime' there).


Optimus technology is a fairly stupid idea in the first place, though - one that exists for business reasons rather than technical ones.


To this day, it's not really possible for a Linux laptop to drive a 4K external display without a lot of inconvenience. Even Broadwell has problems driving 4K over SST @ 60 Hz, so Optimus is useful.


What's stupid about it? Sounds pretty smart to not have to use the powerful GPU if I'm only browsing the web, to me.


Intel's got the good fabs, but they can't get along well enough with the folks who know how to make a good GPU, so you can't get a desktop or mobile system with a decent GPU - at any price - without actually paying for two entirely different GPUs.


"The powerful GPU."

You've been sold the idea that a "powerful GPU" needs to suck a lot of power all the time.

There is no real reason a "powerful GPU" shouldn't be able to scale its power usage way down when doing something simple like browsing the web. The only reason NVidia weren't able to do the "low power" thing on these systems is that they weren't the ones putting their GPUs on the same die as the CPU, like Intel (& AMD) were. But of course they still wanted part of the action, so people ended up being sold this massive engineering bodge and told it's a good thing.


I have a laptop that has an Intel CPU and a powerful discrete AMD GPU. Even though on-die GPUs have gotten better, there is still a considerable difference in performance between on-die and dedicated. AMD and NVidia both realize that there is an onboard chip that can do the common stuff, so they have added the ability to turn the dedicated one off when it isn't needed.


If you imagine a properly, holistically designed product that wasn't full of chips from different warring companies, the high-power GPU could be used to augment the on-die GPU, instead of having to be turned off and deal with a whole bunch of mad signal-switching issues. AMD products can do this to an extent with CrossFire, but generally this is a world that we don't live in.


Laptop or desktop? Because laptop drivers for Ubuntu 14.04 are still not that good. I'm on a ThinkPad T430s with Nvidia (I disable Optimus in the BIOS).


For CUDA only.

And keep in mind, it is a binary blob - proprietary to the core. In fact, CUDA the protocol itself is kinda proprietary.

If you base your solution on OpenCL, you can use all the vendors: Nvidia, AMD, IBM, Intel, Altera, etc...
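For example, the same host code enumerates whatever platforms happen to be installed - a rough sketch of my own, not from the article:

    /* List every OpenCL platform (Nvidia, AMD, Intel, ...) on the machine
       and how many devices each one exposes. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        clGetPlatformIDs(8, platforms, &num_platforms);

        for (cl_uint i = 0; i < num_platforms; i++) {
            char name[256];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                              sizeof(name), name, NULL);

            cl_uint num_devices = 0;
            clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_ALL, 0, NULL, &num_devices);
            printf("Platform %u: %s (%u devices)\n", i, name, num_devices);
        }
        return 0;
    }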

But NVIDIA spends billions on marketing to convince you that only CUDA matters - the same way Sony spent millions to convince you that only LaserDisc^H^H^H^H^H^H MiniDisc^H^H^H^H^H Memory Stick^H^H^H^H^H Blu-ray matters.


The nVidia driver is the most stable and feature-complete driver you can get on Linux.


Yep. A godsend compared to the AMD drivers.



