
LLVM Adds Support for Nvidia GPUs - tambourine_man
http://www.hpcwire.com/hpcwire/2012-05-09/open_source_compiler_adds_support_for_nvidia_gpus.html
======
wtallis
There's also been an LLVM backend for AMD's GPUs for a while (though it
doesn't support the newest Radeon HD 7000 series architecture):
[http://www.phoronix.com/scan.php?page=news_item&px=MTAyN...](http://www.phoronix.com/scan.php?page=news_item&px=MTAyNTg)

~~~
c0n5pir4cy
Slightly more up to date article on an OpenCL back end for LLVM:
[http://www.phoronix.com/scan.php?page=news_item&px=MTA2N...](http://www.phoronix.com/scan.php?page=news_item&px=MTA2NzM)

------
octotoad
From the article: "NVIDIA awakened the world to computer graphics when it
invented the GPU in 1999."

Umm....

~~~
DeepDuh
3dfx: ookay _sadface_

~~~
pmjordan
I'm guessing they're not counting 3D chips without hardware transform &
lighting support as "GPUs" (the 1999 GeForce 256 was the first to have it).
Also, 3Dfx was taken over by nVidia soon after, so I guess it's not completely
wrong. And press releases aren't exactly the best place to look for
unbiased information.

~~~
blt
Yeah, I was a gamer back then, and the Voodoo series were called "3D
accelerators". NVidia coined the term "GPU" for the first GeForce.

------
codedivine
This was already discussed at length on HN in previous posts such as
<http://news.ycombinator.com/item?id=3949899> (77 points at the time of
writing, posted 4 days ago).

~~~
jimktrains2
The front page is constantly changing; I didn't see it 4 days ago, and
apparently a lot of other people didn't either. If you don't like it, just
ignore it.

------
blt
This is awesome, but I don't quite understand how it works. What kind of code
will you write to invoke massively parallel behavior in LLVM? Will every
language have `kernel<<<...>>>`?
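My understanding is that the `<<<...>>>` launch syntax is CUDA C specific,
not something LLVM itself defines. The NVPTX work sits at the IR level, where
a kernel is just an ordinary function marked by metadata. A rough sketch of
what a frontend might emit (function and variable names here are illustrative,
not from the article):

```llvm
; Hypothetical kernel as a frontend might emit it for the NVPTX backend.
target triple = "nvptx64-nvidia-cuda"

define void @scale(float* %out, float %k) {
  ; each thread handles one element, indexed by its thread id
  %tid = call i32 @llvm.nvvm.read.ptx.sreg.tid.x()
  %p = getelementptr float* %out, i32 %tid
  %v = load float* %p
  %r = fmul float %v, %k
  store float %r, float* %p
  ret void
}

declare i32 @llvm.nvvm.read.ptx.sreg.tid.x()

; This annotation is what marks @scale as a kernel entry point,
; as opposed to an ordinary device function.
!nvvm.annotations = !{!0}
!0 = !{void (float*, float)* @scale, !"kernel", i32 1}
```

So each language is free to invent its own launch syntax: CUDA's
`kernel<<<grid, block>>>(args)` is just sugar that nvcc lowers to runtime
launch calls, and another frontend could expose a parallel map or whatever
construct fits the language instead.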

------
Symmetry
Nice, but I've always thought it would be useful to have a slightly higher-level
virtual machine with more explicit mechanisms for expressing parallelism.

