Doubtful. A specialized graphics processor combined with a weak CPU makes sense in a console, where generating and displaying graphics is the core task.
In a server context, GPUs don't make sense. They won't make your web server, your database, or your Ruby code any faster. In fact, in a server the GPU could be stripped out completely to save cost and power without impacting performance.
Programming a GPU or other specialized processor requires custom code. You can't just recompile a database written in C or Java and have it magically take advantage of the hardware. And since people can't afford to rewrite the whole software stack just to serve web pages, it's not going to happen.
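To make that concrete: offloading even a trivial loop to a GPU means writing and launching an explicit kernel and managing device memory by hand, not recompiling existing code. A minimal CUDA sketch (all names here are illustrative, not from any real codebase):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// On a CPU this would be a plain for-loop; for the GPU it has to be
// rewritten as a kernel, where each thread handles one element.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // Data must be explicitly copied to and from the device --
    // another reason a straight recompile of CPU code can't work.
    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    printf("%f\n", host[0]);
    return 0;
}
```

None of this structure (kernels, launch configuration, explicit memory transfers) exists in a typical C or Java server codebase, which is why the rewrite cost is the real barrier.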
As the article says, the main reason those chips might get traction is low power (and possibly low cost), since the cost of power matters a lot at scale and ARM is better at power management than Intel.
That's a key issue: a strong GPU plus a weak CPU is great for just about every market other than general-purpose servers. So is it worth it for Nvidia to build a separate ARM-only chip just for the general-purpose server market, especially when Marvell and Calxeda are already targeting it?
I agree that they're not targeting the Linux VPS/Big Oracle Machine market; for that market this design doesn't make any sense. The power they're saving with the ARM MPU they're giving right back with the multicore nVidia GPU. But there are significant HPC applications that leverage CUDA and would run very nicely on this design, and you can see a dual- or quad-core low-power ARM MPU with two or three low-power GPU units being an attractive option for tablet/phone manufacturers as well.