

Apple ditches NVIDIA, goes with ATI for their desktop lineup - primesuspect
http://tech.icrontic.com/news/apple-switches-to-ati-exclusively-for-imac/

======
Raphael_Amiard
" Apple has been going in the direction of ATI’s OpenCL acceleration
architecture, the competitor to NVIDIA’s CUDA."

OpenCL is _not_ ATI's technology. When I spot such an inaccuracy in an
article, I stop reading it, since I suddenly have no reason to believe
that the rest of it isn't just as inaccurate.

~~~
hakunin
That seems completely illogical, since you cannot and will not verify most of
the information you read, and information does not come in discrete packets
or chunks of truthfulness. Why does one incorrect statement invalidate the
whole article? When you find that your dog has a flaw, you don't throw away
the whole dog now, do you?

~~~
parallax7d
When it comes to news sources, it's best to 'throw out the whole dog' when
they can't get the very basis of their usefulness correct - reporting facts.

~~~
hakunin
This is kind of like a deliberate decision to miss the point based on a
mistake in an irrelevant detail. Too personal to argue about, though.

~~~
Raphael_Amiard
No, it's based on the number of pieces of information from the article you're
able to verify, and how many of those turn out to be false. Also, this
particular piece of information is not irrelevant; it is at the core of the
subject at hand.

If you're only able to verify a small number of the "facts" presented in the
article, and a large proportion of those are false, the probability that the
entire article is bull goes from 'unknown' to 'very high', relative to your
personal knowledge.

You could be wrong, of course, but there are enough sources of information on
the internet to allow me to choose not to read the ones I think are bogus.

Even worse, the mistake I pointed out is a very simple one. It's almost
_impossible_ to conclude, from the information you find on the web, that
OpenCL is ATI's equivalent to NVIDIA's proprietary CUDA technology. The
mistake serves the article well, though, by pitting the two technologies
against each other and making the whole thing more sensational than it needs
to be. So this isn't a simple "irrelevant" mistake like you said. It's bad
journalism, plain and simple.

------
37prime
"ditching" is a strong word. Apple has always been rotating between GPU
vendor.

------
rtrunck
I'm not totally sure Apple has ditched NVIDIA for good - there are reports of
Fermi-based graphics cards working. Apparently the GTX 460, GTX 470, and GTX
480 all work.

[http://www.electronista.com/articles/10/07/03/imac.and.mac.pro.to.get.new.gpus.soon/](http://www.electronista.com/articles/10/07/03/imac.and.mac.pro.to.get.new.gpus.soon/)

I have also heard other reports of users verifying they work, for both real
Macs and hackintoshes.

------
chrisbennet
" Those who pay attention to details probably have noticed that since late
2009, Apple has been going in the direction of ATI’s OpenCL acceleration
architecture, the competitor to NVIDIA’s CUDA."

It isn't really "ATI's OpenCL"; NVIDIA supports OpenCL as well. According to
NVIDIA:

"NVIDIA has chaired the industry working group that defines the OpenCL
standard since its inception and shipped the world’s first conformant GPU
implementation for both Windows and Linux in June 2009."

------
pxlpshr
Well, I didn't have a good experience with NVIDIA, even though it was a great
brand for me when I used a PC many years ago. The NVIDIA 8800 in my Mac Pro
was a miserable failure, and neither NVIDIA nor Apple would RMA it, even
though I had multiple complaints on record about spotty behavior before it
completely failed.

I read lots of complaints on the Apple forums, and not just about this
particular card.

------
rbanffy
Am I the only one to notice the irony of an Intel CPU coupled to an AMD GPU?

edit: the downvotes show I am

------
keyle
I'm not sure Apple will actually stop using NVIDIA entirely, but that's great
news. Great news for Hackintoshes.

------
stuff4ben
I wonder if this signals a possible switch to AMD CPUs in the future? Maybe
AMD offered them a great deal on graphics boards to entice them to try their
CPUs. Of course, this is pure speculation.

------
Groxx
We'll get more OpenCL & OpenGL focus on ATI cards now :)

~~~
dedward
Does anything actually use OpenCL yet?

------
jlcgull
A sad day indeed, at least for me. I inevitably end up running some flavor of
Linux on almost all my computers sooner or later. ATI's support for Linux,
compared to nVidia's, has been piss-poor to say the least (certainly in my
experience). I have always recommended and bought nVidia cards for this one
reason. In fact, we recently went with nVidia's Quadro FX 4800 cards for 5 new
workstations (requiring stereo vision on Linux/macOS). Want to guess the
biggest factor why no one on the team even dared consider ATI? The (almost
always) nightmarish experiences with ATI's low-end consumer cards on their
personal Linux machines led to a lack of faith in any ATI product.

 _Please note, as far as Linux support goes, I am talking about the official
ATI closed-source drivers._

~~~
mdwrigh2
Unfortunately, this is exactly what I came in here to say. This truly is a sad
day for anyone that wants to use anything but OSX on their Mac.

(Written on a Macbook Pro running Ubuntu 10.04)

~~~
dman
As I said above, the reality of current ATI drivers is much better than the
fiasco that existed four years ago. On Ubuntu it just works.

~~~
mdwrigh2
Unfortunately, this is not my experience. Just this summer I installed Ubuntu
10.04 desktop on a computer with a relatively new ATI card (it was top of
the line a couple of years ago, I believe). While it's fine in general use,
there are plenty of really annoying issues. Unfortunately for us, as we were
attempting to use it as a media center, there were significant issues with
full-screen video: it would freeze up for 5-10 seconds at a time every
minute or two. This is a known issue for the entire series of cards, and it is
a driver bug, but it hasn't been fixed in the 1+ year the issue has been open.
Then there are other, smaller issues with the card, such as a 5-second delay
when maximizing a window if you're using Compiz. Not a deal-killer, but still
annoying, and there were a few similar bugs like this. After this incredibly
sour experience with ATI, I'm not sure I'll recommend them again for a long
time. And I haven't even gotten into how long I had to fight with X just to
get dual monitors to work.

~~~
dman
At some arbitrary time in the past, when I was using Compiz and things were
slow on an ATI card, I used the X.Org server from this PPA:
[https://launchpad.net/~ubuntu-x-swat/+archive/xserver-no-backfill](https://launchpad.net/~ubuntu-x-swat/+archive/xserver-no-backfill).
I don't know if that PPA is updated for Lucid, but you could give it a try. If
you're using it as an HTPC, can't you disable Compiz? I mean, in most cases
wouldn't you want to boot directly into MythTV/XBMC? Also, can you specify
what card you're using? If it's the X1xxx series or earlier, then you're
beyond the support window (the closed-source driver is only for newer cards)
and you might be better off using the radeonhd driver. I think I should do a
write-up on how to set up ATI drivers on Linux the right way. It takes < 5
minutes to get a dual-head display with auto-detection going these days. HINT:
aticonfig is your friend.

------
jc-denton
As a CUDA developer, I'm pissed off by this decision. Last time I checked,
OpenCL wasn't ready (much slower than CUDA), so I don't see how one could use
the new Macs for serious GPU development :(

Also, Adobe's Premiere uses CUDA. Maybe Apple wants to make their own product
(Final Cut) look better on Macs?

