Oh please. It's not about pixel resolution; it's about the amount of realism in each pixel. Pixar and the FX houses render frames at the same 1080p resolution - but they spend anywhere from 2 to 10 hours PER FRAME doing it. The GPU is getting better, but the artistic and realistic quality we want from images will probably always stay ahead of the technology.
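Back-of-envelope, with a made-up but ballpark number (say 5 hours per offline frame, in the middle of that 2-10 hour range, against a 60 fps realtime budget):

```python
# Rough comparison of offline vs. realtime per-frame compute budgets.
# 5 hours per frame is an assumption, picked from the 2-10 hour range above.
offline_seconds_per_frame = 5 * 60 * 60   # ~5 hours per frame on a render farm
realtime_seconds_per_frame = 1 / 60       # 60 fps target on a GPU

ratio = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"Offline renderers spend ~{ratio:,.0f}x more time per frame")
# -> roughly 1,080,000x more wall-clock time per frame
```

That's about a million-fold gap in per-frame compute budget before you even argue about quality per cycle.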
You could make this same, lame argument about CPUs, but we've always found uses for every extra cycle there too.
But the GPU has nothing to do with the rendering techniques you mention in the Pixar case. You don't use the OpenGL pipeline on a GPU to render a Pixar movie; all of those tricks are done in non-realtime offline rendering to make the frames look that beautiful.
To some extent, he's right that GPU processing power will get to a point where pumping more pixels will not be discernible to the human eye. But that doesn't mean there aren't more transformative steps you can insert into the rendering pipeline to make images look more realistic. Convincing fluid and smoke dynamics come to mind.
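To give a feel for why "convincing fluids" eat compute: here's a toy sketch of just the diffusion step from a Stam-style grid solver (the grid size and constants are made up for illustration). Even this simplified 2D piece touches every cell many times per frame, and a real solver adds advection, pressure projection, a third dimension, and far bigger grids.

```python
import numpy as np

def diffuse(field, diff, dt, iters=20):
    """One toy diffusion step on a 2D grid via Jacobi iteration.

    A real smoke/fluid solver also needs advection and pressure
    projection, runs in 3D, and repeats all of it every frame.
    """
    n = field.shape[0]
    a = dt * diff * n * n
    result = field.copy()
    for _ in range(iters):
        result[1:-1, 1:-1] = (
            field[1:-1, 1:-1]
            + a * (result[:-2, 1:-1] + result[2:, 1:-1]
                   + result[1:-1, :-2] + result[1:-1, 2:])
        ) / (1 + 4 * a)
    return result

density = np.zeros((128, 128))
density[60:68, 60:68] = 1.0                     # a puff of smoke
density = diffuse(density, diff=0.0001, dt=1 / 60)
```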
The work on general-purpose computing with GPUs (e.g., protein folding) has only begun. This article is either trolling or shortsighted.
It's like looking at the web in 1995 and saying it's dying because there are only so many pages you can fit on a "what's new" page (at one point there were so few webpages that you could list all of them on one page).
GPUs are very exciting and a major reason why desktop computing will continue to matter outside of business applications.
The thing he doesn't get is that you can always use more, smaller polygons. Rendering a single 3D cube is trivial at any resolution/framerate you want; the problem is that polygons interact and intersect, and that takes real computing power.
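Crude illustration of why the interaction part is what hurts (a naive sketch, not how a real engine does it): testing every polygon's bounding box against every other one grows quadratically with polygon count, which is exactly why engines throw so much compute and clever spatial data structures at it.

```python
import random

def aabb_overlap(a, b):
    """Axis-aligned bounding-box overlap test.
    Boxes are (min_x, min_y, min_z, max_x, max_y, max_z)."""
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

# Random toy "polygons", represented only by their bounding boxes.
boxes = []
for _ in range(2000):
    x, y, z = (random.uniform(0, 100) for _ in range(3))
    boxes.append((x, y, z, x + 1, y + 1, z + 1))

# Naive all-pairs check: nearly 2,000,000 tests for just 2,000 boxes.
hits = sum(
    aabb_overlap(boxes[i], boxes[j])
    for i in range(len(boxes))
    for j in range(i + 1, len(boxes))
)
print(hits, "overlapping pairs")
```

Drawing one cube is constant work; making thousands of objects collide, shadow, and deform against each other is where the cycles go.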
GPUs are getting more and more programmable, while CPUs are getting more and more cores and streaming instructions.
It's getting harder and harder to tell them apart, which means a typical PC increasingly looks like two general-purpose computers talking to each other over the PCI bus. Which is kinda dumb.
My eyes started glazing over when he started doing math. Did he calculate the amount of processing power for, say, a room-sized, 6-sided monitor, or for a monitor with 1000s of layers in the Z-dimension? Because it seems like those would take quite a bit of power...
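For what it's worth, the back-of-envelope version (the display specs here are hypothetical, just to get an order of magnitude): six 4K walls with a thousand depth layers is a staggering pixel count next to one 1080p panel.

```python
# Hypothetical numbers, just to get an order of magnitude.
hd_pixels = 1920 * 1080                  # one ordinary 1080p monitor
wall_pixels = 3840 * 2160                # assume each wall is a 4K panel
cave_pixels = 6 * wall_pixels            # six-sided room
volumetric_pixels = cave_pixels * 1000   # 1000 layers in the Z-dimension

print(f"6-sided room:  {cave_pixels / hd_pixels:,.0f}x the pixels of 1080p")
print(f"with Z layers: {volumetric_pixels / hd_pixels:,.0f}x the pixels of 1080p")
# -> roughly 24x and 24,000x respectively
```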