
Intel’s “Iris” wants to change how you feel about integrated graphics - pedrocr
http://arstechnica.com/gadgets/2013/05/intels-iris-wants-to-change-how-you-feel-about-integrated-graphics/
======
jws
Support OpenCL on something other than Windows and we can talk about how I
feel. For now I feel I've wasted a lot of money on expensive silicon die area
that I can't put to use.

Edit: I mean OpenCL that uses the GPU. The last time I looked, Intel had
OpenCL for Linux, but it only used the CPU. It's a big world; someone may even
have a use for that.
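(For anyone who wants to verify this on their own box: OpenCL reports device types as a bitmask, so you can check whether a platform exposes a GPU device at all, or only the CPU. A minimal sketch — the `CL_DEVICE_TYPE_*` values match `cl.h`, and the pyopencl part at the bottom is an assumption that requires pyopencl plus a working OpenCL runtime, so it's guarded.)

```python
# Sketch: does an OpenCL runtime expose a GPU device, or only the CPU?
# Bitmask values are taken from cl.h (CL_DEVICE_TYPE_CPU, CL_DEVICE_TYPE_GPU).
CL_DEVICE_TYPE_CPU = 1 << 1
CL_DEVICE_TYPE_GPU = 1 << 2

def has_gpu(device_types):
    """True if any reported device-type bitmask includes the GPU bit."""
    return any(t & CL_DEVICE_TYPE_GPU for t in device_types)

# A CPU-only runtime (like Intel's Linux OpenCL at the time) looks like this:
print(has_gpu([CL_DEVICE_TYPE_CPU]))  # False

if __name__ == "__main__":
    try:
        import pyopencl as cl  # assumption: pyopencl installed
        for p in cl.get_platforms():
            types = [d.type for d in p.get_devices()]
            print(p.name, "exposes a GPU device:", has_gpu(types))
    except Exception:
        pass  # no pyopencl / no OpenCL runtime available
```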

~~~
pavanky
Good news, friend. This[1] was announced a few weeks ago.

[1] <http://cgit.freedesktop.org/beignet/tree/README.md>

~~~
jws
_…you may expect serious problems and GPU hangs._

 _Only IVB is supported right now. Actually, the code was only run on IVB GT2.
You may expect some issues with IVB GT1._

 _The run-time is far from being complete._

In the future there may be good news.

------
rjknight
Anyone else think that "Iris" is a fairly obvious reference to "retina"
displays, indicating that Intel expects devices which come with Haswell/Iris
to include retina-standard displays?

(It has been years since I've commented on a hardware thread, I had forgotten
how much fun this kind of codename Kremlinology can be)

~~~
ancarda
We can assume the new MacBooks will come with Haswell. Perhaps the Iris
graphics will be driving the Retina display? Right now the Retina MacBook
comes with Ivy Bridge and a separate GPU although I don't know if it uses
integrated graphics to drive the display.

~~~
reaperhulk
rMBPs have switchable graphics and will use both the IVB GPU and the Nvidia
GPU to drive the display. The IVB GPU is noticeably choppy on some types of
scrolling when you've got it set to 1680x1050 equivalent or higher, so the
Haswell improvements will be welcome for newer buyers.

~~~
untog
The 13" doesn't have switchable graphics though, right? That's what has
stopped me from upgrading so far. Hoping that June brings good news.

~~~
reaperhulk
Right, 15" only.

------
Avshalom
I guess we can't expect benchmarks yet, but it feels like an intentional
oversight to mention AMD once and only in the context of standalone graphics
chips.

------
lcentdx
According to AnandTech, performance is on par with the GTX 650M, which is
really impressive considering Intel's past stance on integrated graphics. AMD
APUs' graphics advantage has been eaten away by Intel bit by bit, which is not
good news for them.

~~~
JanneVee
You saw that AMD announced hUMA? AMD's APUs will have the advantage again if
they land around the same time.

------
ck2
Well, for a start, Intel would have to make an IGP that supports dual-link
(resolutions over 2048x1536), because as of right now, none of them can.

~~~
drivebyacct2
What is IGP?

For example, the HD4000 can drive a 2560x1440 display, but it doesn't work
over DVI, only DisplayPort, on the integrated mobos. I've never understood
whether that's a mobo limitation (one that's just present on all Sandy Bridge
mobos) or whether the CPU can't pin out to Dual Link DVI but can to
DisplayPort.
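(The arithmetic behind that DVI limitation: single-link DVI tops out at a 165 MHz TMDS pixel clock, and 2560x1440@60Hz needs well above that. A back-of-the-envelope check — the blanking figures are approximate reduced-blanking values, used purely for illustration:)

```python
# Why 2560x1440@60Hz doesn't fit over single-link DVI.
SINGLE_LINK_DVI_MHZ = 165.0  # single-link TMDS pixel clock ceiling

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=41):
    """Approximate pixel clock for a mode with reduced blanking."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2560, 1440, 60)
print(f"2560x1440@60 needs ~{clk:.0f} MHz pixel clock")
print("fits single-link DVI:", clk <= SINGLE_LINK_DVI_MHZ)      # False
print("fits dual-link DVI:  ", clk <= 2 * SINGLE_LINK_DVI_MHZ)  # True
```

(1920x1200 with reduced blanking comes in just under 165 MHz, which is why that's the classic single-link ceiling.)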

~~~
nossralf
The graphics output from the CPU is sent over Intel's own FDI (Flexible
Display Interface), so the limitation lies in the PCH (chipset) that converts
FDI to the various display interfaces.

I looked at one datasheet [0] for the 7-series chipsets and it didn't mention
Dual Link DVI at all (although DVI was obviously mentioned). It did say
"DisplayPort/HDMI/DVI 2560x1600 at 60Hz" in one of the tables, but your guess
is as good as mine as to whether that implies support for Dual Link DVI.

Edit: added link to referenced datasheet. See chapter 5.28.

[0]
[http://www.intel.com/content/dam/www/public/us/en/documents/...](http://www.intel.com/content/dam/www/public/us/en/documents/datasheets/7-series-
chipset-pch-datasheet.pdf)

------
sn0v
I'm presuming this will lead to cheaper laptops :) not to mention higher-res
panels ably powered by the on-board GPU.

~~~
CognitiveLens
This announcement doesn't really seem targeted toward cheaper laptops - the
parts themselves will likely cost just as much as their previous-generation
counterparts, with the addition of a few "premium" options for people who want
more than the standard internal graphics but less than a discrete GPU.

The market fragmentation issue is important, and unfortunately it's not clear
that there is a fragment that is specifically targeted toward the budget end
of the market.

~~~
zokier
I think it's fair to assume that perf/$ will increase, and that will lead to
cheaper devices with acceptable perf.

------
lucian1900
It's AMD's fusion chips that have changed my feelings about integrated
graphics. Intel GPUs? Not so much.

------
bane
I have a cheap Nvidia graphics card from 2009.

2009.

I haven't met anything it can't run even if it isn't the fastest.

Intel's integrated garbage can't keep up. And it's not just the speed either:
crap drivers, screen artifacts, and in 3D (what it's for), texture issues,
lighting issues, shader problems, etc.

Without even knowing what's under the hood of a particular machine, you can
almost _feel_ the crappiness of the Intel integrated graphics chip,
haphazardly trying to present a workable frame of some game you're playing.

I have no faith that these parts will match up.

 _edit_ here's a demo <https://www.youtube.com/watch?v=Otcge1cn8Os>

~~~
gcr
It's all fun and games until Nvidia's binary blob locks up. This happens to
me monthly on Linux. In terms of stability, Intel's much better than Nvidia,
based on what I've seen.

~~~
lucian1900
On Linux. On anything else, Nvidia have the best drivers.

~~~
lmm
Even on Linux, I've found Nvidia's drivers the best by far.

~~~
lucian1900
It seemed to me that the open source ones tend to be the most stable,
regardless of vendor.

~~~
lmm
Shrug; again, very much not my experience over the course of several nvidia
(and a couple of ati/amd and intel) cards.

------
bch
Publish an implementable spec for high-performance opensource drivers and I'll
change how I feel.

------
kalmar
As they have been for most of the last decade. Here's hoping this time they
do. :)

~~~
Ergomane
They've already come a long way. I recently got an i5 with an HD4000 and was
blown away by the fact that it runs Mass Effect 3 smoothly at 1600x900. I have
high hopes for Haswell.

------
dharma1
Looking at the power consumption, the performance is incredibly good.

Nvidia should take a leaf out of their book; then again, I guess Intel's got a
power consumption edge on Nvidia in terms of die size.

Can't wait to get a Haswell macbook!

I wonder how this will filter down to Intel-based mobile phones.

------
kloppi
Almost surely still going to be utter shit compared to a decent GPU like a
Radeon HD 7970 or GeForce GTX 680.

~~~
lobster45
Have you seen the size of those cards? I have a GTX 670, and it is a foot long
and uses close to 200 watts of power. You can't compare high-end graphics
cards that big and power-hungry, costing over $400, to an integrated GPU.

~~~
qdog
The GTX 650 and HD 7750 would probably be better to look at; they're now
under $100 and don't pull that much power.

Whenever I look at integrated chips, which don't have dedicated memory, I just
wonder if they can really run fast enough given that single bottleneck. If the
on-chip graphics had 512MB or 1GB of memory, they'd probably perform well, but
then you lose the power efficiency. I read something about Intel possibly
doing some new tech for on-chip memory to get lower power, so it's possible
it'll happen.

It's probably still in the future, though.
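(For a sense of the size of that bottleneck, rough peak-bandwidth figures — nominal spec numbers, for illustration only: an integrated GPU shares dual-channel DDR3 with the CPU, while a GTX 650 gets its own 128-bit GDDR5.)

```python
# Rough memory-bandwidth comparison behind the "single bottleneck" point.
def ddr3_dual_channel_gbs(mt_per_s):
    """Peak bandwidth: 2 channels x 8 bytes x transfer rate (MT/s)."""
    return 2 * 8 * mt_per_s / 1000

def gddr5_gbs(bus_bits, mt_per_s):
    """Peak bandwidth: bus width in bytes x effective transfer rate (MT/s)."""
    return (bus_bits / 8) * mt_per_s / 1000

system_ram = ddr3_dual_channel_gbs(1600)  # DDR3-1600, shared with the CPU
gtx650 = gddr5_gbs(128, 5000)             # GTX 650: 128-bit GDDR5 at 5 GT/s

print(f"dual-channel DDR3-1600: {system_ram:.1f} GB/s")  # 25.6
print(f"GTX 650 GDDR5:          {gtx650:.1f} GB/s")      # 80.0
```

(Roughly a 3x gap, and the integrated part has to share its 25.6 GB/s with the CPU on top of that.)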

