I hope these GPUs will be far better documented and have very good open-source drivers --- Intel has historically been much more open than ATI/AMD or nVidia at releasing public developer information.
But what's interesting is that in the article, he said the Intel GFX department was "begging to let them build a GPGPU". He might have been exaggerating, but he made it seem like all they needed was to slap some ROPs and texture units on that thing.
That makes me question whether we'll be seeing a brand new architecture, or an Intel HD unleashed edition with some GDDR5.
While it's focused on ISPC (https://ispc.github.io), it does go into a lot of detail about what happened at Intel during the Larrabee and post-Larrabee days.
Great. Except the kind of parallel processing Larrabee was supposed to provide isn't something anyone wanted.
It's a half-pregnant architecture. Not parallel enough to compete with GPUs, not enough single thread performance to compete with Intel's own CPUs.
Given Intel has had the lead in transistor density for years, and will keep it for at least a few more, I wonder why Intel hasn't jumped into this earlier.
The largest obstacle for a GPU is actually the software and driver support, especially in gaming. Middleware engines have made this 10x easier, but it is still a fairly hard problem.
I wish they would call their new silicon AGP: Advanced Graphics Processor. :P
(1) The Vega partnership with AMD, (2) Thunderbolt eGPUs, (3) Apple going PowerVR-less in A1x chips, (4) Adreno on budget Win 10 devices.
My prediction would be headless desktop motherboards, with Intel Inside laptops and NUCs using Vega graphics.
I assume you mean that each GPU gets a variable number of execution units, as opposed to code randomly failing depending on the core it gets run on.
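Right, and that works because the unit count is a per-SKU property that host code queries at runtime, so a kernel never cares which physical units it landed on. A minimal sketch of that query, assuming a machine with pyopencl installed and at least one OpenCL device (reported names and counts will obviously vary per GPU):

    # Query how many compute units each OpenCL device exposes.
    # Work gets scheduled against this runtime value, not a hard-coded core count.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            units = device.get_info(cl.device_info.MAX_COMPUTE_UNITS)
            print(f"{device.name}: {units} compute units")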
You've just helped me realize that AMD and Nvidia overproduce every GPU and kill off the execution cores that don't work to reach the final fixed number (and that all GPUs probably have a few tens to hundreds of working cores that will never ever be used).
Unfortunately I don't think such variation would work from a market standpoint: actually exposing the lossiness rating would devalue you as a manufacturer, simply because the media would pounce on it. Even if they just reported what's there without attributing malice to anyone, the fuss would mean naive users would go "wat", collectively conclude "well, the number is smaller", read that as vague less-iness, and bam: people no longer think your product is as good.
Despite the fact that, for all intents and purposes (read: not mining), you are 100% right.
That's basically the idea, but it's usually scaled down. A single GPU generation is essentially one chip design. If a die has a defect in a particular unit, that unit is disabled (fused off) at the factory and the die is sold as a lower-end chip. Repeat until you have a chip that works. If you have a die where everything works, clock it up as much as you can and ask a king's ransom for it.
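To make the flow concrete, here's a toy sketch of that binning decision; the unit counts, clock thresholds, and tier names are invented for illustration and don't correspond to any vendor's real bins:

    # Toy model of die binning: dies from one chip design get sorted into
    # SKUs by how many units survived and how high they clock.
    # All numbers and tier names below are illustrative, not real products.
    def bin_die(working_units: int, stable_clock_mhz: int) -> str:
        if working_units >= 64 and stable_clock_mhz >= 1800:
            return "flagship: fully enabled, top clocks, king's-ransom pricing"
        if working_units >= 56:
            return "cut-down: defective units fused off, sold a tier lower"
        if working_units >= 48:
            return "budget: further cut down, lower clocks"
        return "scrap: too many defects to sell"

    # A perfect die vs. one with a single defective unit.
    print(bin_die(64, 1900))
    print(bin_die(63, 1700))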
The best-known exception is the PS3 "Cell" CPU: it was designed with an extra 'sacrificial' processing core, so one SPE could be disabled in every shipped chip to improve yields.
What's your line of thinking here? Are you saying that all cryptocurrency won't exist by then, or just BTC?
Why 2020 as the time frame? What is going to happen between now and then that's going to make it so that it won't exist?
Can you qualify what you mean by "won't exist"? All the code is open source so I've got copies saved locally, and I've also got enough hardware to run the whole network myself. I could, if need be, single-handedly restart/run BTC.
It's my opinion that the only people more ridiculous than those who insist Bitcoin will replace all fiat currency are those who continually insist on its imminent demise and absolute lack of value.
Other coins are slightly more profitable though.
The impression I get is that the only people taking cryptocurrencies seriously are management types who say "blockchain" in their sleep without having a clue what it means, obscure types with a few million invested, and the black market weaving in and out of the shadows.
It isn't economical to mine Bitcoin on GPUs (in most places, unless you have extremely cheap electricity and GPU cards).
It can be economical to mine other cryptocurrencies.
However, GPUs are used to mine other cryptocurrencies, which use different algorithms for which there are no ASICs.
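For a rough sense of why that is, here's a back-of-the-envelope profitability check; every input (hashrates, block reward, coin price, power draw, electricity cost) is a placeholder assumption you'd replace with current figures:

    # Back-of-the-envelope daily mining profitability for one GPU.
    # Every input below is a placeholder assumption, not live network data.
    def daily_profit_usd(hashrate, network_hashrate, block_reward_coins,
                         coin_price_usd, power_watts, electricity_usd_per_kwh,
                         blocks_per_day=144):
        # Expected share of the network's daily block rewards, minus power cost.
        revenue = (hashrate / network_hashrate) * blocks_per_day \
                  * block_reward_coins * coin_price_usd
        power_cost = (power_watts / 1000) * 24 * electricity_usd_per_kwh
        return revenue - power_cost

    # One GPU doing SHA-256 (~1 GH/s) against an ASIC-dominated Bitcoin
    # network (tens of EH/s): revenue is effectively zero, so the
    # electricity bill alone makes it a guaranteed daily loss.
    print(daily_profit_usd(hashrate=1e9, network_hashrate=3e19,
                           block_reward_coins=12.5, coin_price_usd=10_000,
                           power_watts=200, electricity_usd_per_kwh=0.12))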