
Intel to Release Discrete Graphics Card in 2020 - modeless
https://www.tomshardware.com/news/intel-discrete-gpu-2020-raja,37289.html
======
siberianbear
I remember the first time Intel tried this, in 1998, when they introduced the
i740 AGP card. [1][2] The data sheet is still out there if you know where to
look. [3] It made everyone in the industry, from S3 and Matrox to ATI, Nvidia
and 3dfx, tremble. In the end, it flopped.

[1]
[https://www.anandtech.com/show/202/3](https://www.anandtech.com/show/202/3)

[2]
[https://assets.hardwarezone.com/2009/reviews/video/v2740tv/v...](https://assets.hardwarezone.com/2009/reviews/video/v2740tv/v2740tv.htm)

[3]
[http://www.vgamuseum.info/index.php/cpu/item/download/924_ee...](http://www.vgamuseum.info/index.php/cpu/item/download/924_eea0c29729fc2e16b2514e590d74bb72)

~~~
rangibaby
I had an i740 and it was OK. It ran games from the era without complaint:
[https://youtu.be/x8sZ3kazUaU](https://youtu.be/x8sZ3kazUaU)

------
userbinator
This was one of their previous attempts:
[https://en.wikipedia.org/wiki/Intel740](https://en.wikipedia.org/wiki/Intel740)

I hope these GPUs will be far better documented and have very good open-source
drivers --- Intel has historically been much more open than ATI/AMD or nVidia
at releasing public developer information.

~~~
sevensor
Wow, I was an intern at Intel when they were making the i740. I would run the
WHQL tests on sample boards / drivers and document which tests they failed.
Part of the job was testing them with preview builds of the upcoming NT5.0.
Tests involved playing a whole bunch of different games in 30 second
stretches, changing screen resolution, and repeating. The full-timers also
thought it was very important to stress test the hardware, which meant playing
Action Quake 2 over lunch every day. They were quite good.

------
aneutron
There was an article written by a guy from the Larrabee team, where he
explained how the project wasn't a failure, in the sense that management wanted
a processing beast and that's what they got, not a GPGPU.

But what's interesting is that in the article, he said the Intel GFX department
was "begging to let them build a GPGPU". He might have been exaggerating, but
he made it seem like all they needed was to slap some ROPs and texture units on
that thing.

That makes me question whether we'll be seeing a brand new architecture, or an
Intel HD unleashed edition with some GDDR5.

~~~
nl
_There was an article written by a guy from the Larrabee team, where he
explained how the project wasn't a failure, in the sense that management
wanted a processing beast and that's what they got, not a GPGPU._

Great. Except the kind of parallel processing Larrabee was supposed to provide
isn't something anyone wanted.

It's a half-pregnant architecture. Not parallel enough to compete with GPUs,
not enough single thread performance to compete with Intel's own CPUs.

~~~
dTal
As someone who was doing enormous amounts of CPU-bound raytracing at the time,
I wanted it. But quite honestly the product sounds absolutely incredible even
without workloads like that - to read the article, it worked perfectly well as
a graphics card, with the added bonus that 1) the graphics code was in
software, meaning you could arbitrarily upgrade the graphics model to support
later software, and 2) the system _also_ ran general-purpose workloads
simultaneously.

------
ksec
I am wondering if GPU design is basically approaching its limit, constrained
by smaller nodes and memory bandwidth. We don't see better performance per die
/ per transistor any more, and haven't for many years. The gains seem to be
small.

Given Intel has had the lead in transistor density for years, and will
continue to do so for at least a few more years, I wonder why Intel hasn't
jumped into this earlier.

The largest obstacle for a GPU is actually the software and driver support,
especially in gaming. Middleware engines have made this 10x easier, but it is
still a fairly hard problem.

I wish they would call their new silicon AGP: Advanced Graphics Processor. :P

------
dawnerd
This will be great for everyone. About time there's some more competition.

------
petecox
Call me a conspiracist, but is Intel strategically roadmapping an exit from
the IGP market? Perhaps a new focus on high-performance OpenCL computing, if
the returns on low-power IGPs are diminishing, viz:

(1) the Vega partnership with AMD, (2) Thunderbolt eGPUs, (3) Apple going
PowerVR-less in its A1x chips, (4) Adreno on budget Win 10 devices.

Headless desktop motherboards, with Intel Inside laptops and NUCs utilizing
Vega graphics, would be my prediction.

------
simcop2387
I really hope they don't do weird market segmentation stuff with their virtual
GPU support. They've currently got the best support for doing that with
qemu/kvm. AMD's SR-IOV support exists, but the hardware support still isn't
very widespread. No idea about the Nvidia side of things.

------
mikece
Hedging bets against the rise of ARM processors?

------
crb002
I'm surprised they don't make lossy GPUs that have a few defects among the
obscene number of transistors. For consumer graphics you wouldn't know the
difference.

~~~
exikyut
Interesting idea.

I assume you mean that each GPU gets a variable number of execution units, as
opposed to code randomly failing depending on the core it gets run on.

You've just helped me realize that AMD and Nvidia overproduce every GPU and
kill off the execution cores that don't work to reach the final fixed number
(and that all GPUs probably have a few tens to hundreds of working cores that
will never ever be used).

Unfortunately I don't think such variation would work, from a market
standpoint: actually exposing the lossiness rating would devalue you as a
manufacturer, simply because the media would pounce on it like crazy. Even
though said media would simply report what's there without putting malice in
anyone's mouth, the fuss would mean naive users would go "wat", collectively
conclude "well, the number is smaller", interpret that to mean "vague
less-iness", and bam, people don't think your product is as good.

Despite the fact that, for all intents and purposes (read: not mining), you
are 100% right.

~~~
theandrewbailey
> You've just helped me realize that AMD and Nvidia overproduce every GPU and
> kill off the execution cores that don't work to reach the final fixed number
> (and that all GPUs probably have a few tens to hundreds of working cores
> that will never ever be used).

That's basically the idea, but it's usually scaled down. A single GPU
generation is essentially one chip design. If a single die has a defect in a
particular unit, that unit is fused off at the factory and the die is sold as
a lower-end chip. Repeat until you have a chip that works. If you have a die
where everything works, clock it up as much as you can, and ask a king's
ransom for it.
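
A toy sketch of that binning flow, purely illustrative: the unit counts, SKU
cut-offs, and 1% defect rate below are made-up numbers, not figures from any
real product line.

    import random

    TOTAL_UNITS = 40     # execution units physically present on every die (made up)
    DEFECT_RATE = 0.01   # chance that any single unit is defective (made up)
    SKUS = [(40, "flagship"), (36, "mid-range"), (28, "budget")]  # enabled units per SKU

    def bin_die():
        """Count working units on one die and give it the highest SKU it can satisfy."""
        working = sum(random.random() > DEFECT_RATE for _ in range(TOTAL_UNITS))
        for enabled, name in SKUS:
            if working >= enabled:
                return name   # any working units beyond `enabled` get fused off
        return "scrap"

    bins = [bin_die() for _ in range(10_000)]
    for _, name in SKUS:
        print(name, bins.count(name))
    print("scrap", bins.count("scrap"))

Even with these made-up numbers, most dies land in the top bin but a meaningful
fraction can only be sold as cut-down parts, which is why the lower-end SKUs
exist at all.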

The best-known exception is the PS3 "Cell" CPU.[0] It had an extra
'sacrificial' processing core: the chip has eight SPEs, but the PS3 only ever
enables seven, so a die with one defective SPE can still be shipped.

[0]
[https://en.wikipedia.org/wiki/Cell_(microprocessor)#Commerci...](https://en.wikipedia.org/wiki/Cell_\(microprocessor\)#Commercialization)

------
melling
They must want some of the bitcoin market.

~~~
deweller
Graphics cards aren't used to mine bitcoin any more. Only chips that are made
specifically for mining bitcoin (ASICs) are used in mining rigs.

~~~
melling
When did that change? I still see it mentioned.

[https://venturebeat.com/2018/02/11/nvidia-ceo-cryptocurrency...](https://venturebeat.com/2018/02/11/nvidia-ceo-cryptocurrency-mining-drove-a-spike-in-graphics-chip-sales/)

[https://www.techradar.com/news/best-mining-gpu](https://www.techradar.com/news/best-mining-gpu)

~~~
nl
According to [1], if you have 15c/kWh electricity you can make about $1/day
with a 1080 Ti on _Bitcoin_. That of course means it would take over a year
just to cover the cost of the card (rough math sketched below).

Other coins are slightly more profitable though.

[1] [https://crypt0.zone/calculator/s/most-profitable-coin-to-min...](https://crypt0.zone/calculator/s/most-profitable-coin-to-mine-1080-ti-at-this-moment)

~~~
lozaning
What these calculators never seem to account for is the difficulty factor.
That dollar a day today rapidly approaches zero as more and more people bring
more equipment online. No way you're making a dollar a day 600 days from now.
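
A tiny sketch of how fast that dollar a day erodes if difficulty keeps
climbing; the 1% daily growth rate is a pure assumption for illustration.

    # Model daily revenue shrinking as network difficulty grows at a fixed rate.
    daily_revenue = 1.00    # USD/day today, from the parent comment
    growth_per_day = 0.01   # assumed daily difficulty growth, illustrative only

    total = sum(daily_revenue / (1 + growth_per_day) ** day for day in range(600))
    day_600 = daily_revenue / (1 + growth_per_day) ** 600

    print(f"revenue on day 600: ${day_600:.2f}/day")  # effectively zero
    print(f"earned over 600 days: ${total:.0f}")      # roughly $100, not $600

Under that assumption the card never comes close to paying for itself, even
summing everything it earns over the whole 600 days.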

~~~
nl
Yeah, I agree.

