Hacker News
Intel A770, A750 review: We are this close to recommending these GPUs (arstechnica.com)
19 points by Tomte on Oct 5, 2022 | hide | past | favorite | 13 comments



I buy cards primarily on:

- Machine learning / GPGPU support

- Stability

- RAM

A 16GB card for $349 is a good deal. That's more than a 3080, and same as a higher-end 4080.

If Intel is ever able to ship stable open-source drivers supporting pytorch, tensorflow, numpy, etc., this will be a game changer.
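The usual way Intel GPU support surfaces in PyTorch is via Intel's `intel-extension-for-pytorch` package, which registers an `"xpu"` device; a minimal sketch of picking the best available backend, assuming that package (the function name `best_device` is mine, and the snippet falls back gracefully if neither torch nor the extension is installed):

```python
def best_device() -> str:
    """Pick an accelerator device string, preferring Intel's "xpu" backend.

    Assumes intel-extension-for-pytorch, which registers the "xpu"
    device with PyTorch when imported; falls back to CUDA, then CPU.
    """
    try:
        import torch
    except ImportError:
        return "cpu"  # no PyTorch at all
    try:
        import intel_extension_for_pytorch  # noqa: F401  (side effect: registers "xpu")
        if torch.xpu.is_available():
            return "xpu"
    except ImportError:
        pass  # extension not installed; try other backends
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

print(best_device())
```

The point of the indirection is that downstream code can stay backend-agnostic: `model.to(best_device())` works the same whether the card underneath is Intel, NVidia, or nothing.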

What I really hope Intel or AMD decide to ship is a card designed to disrupt the higher-end NVidia products here: something with e.g. 64GB RAM for $500-$1000. That would bring voice recognition, image generation, speech synthesis, etc. to a much broader audience. Or better yet, some hybrid model which can make use of system RAM as well, so the memory is upgradeable.

Yes, I know all the points about performance, things being IO-bound, and whatnot. I'll take it. For gaming, they could also include a smaller amount of very fast RAM.


Yup. Made a massive mistake getting an AMD graphics card last time around. Has never worked for any ML workload ever.


A 16GB card for $349 is a good deal. That's more than a 3080, and same as a higher-end 4080.

What do you mean? 16GB 4080 is $899.


The parent comment is laser-focused on the VRAM capacity of the cards. They are saying 16GB is more VRAM than you find on the 10/12 GB RTX 3080, and the same as the 16GB RTX 4080.


Oh I see. My guess is it will be competing with 4060 cards as far as gamers are concerned.


At this point, machine learning has seeped into a lot of my workflows. What I can do and how well is capped by video memory. This is useful enough that it's just a matter of time before this makes it into mainstream tools. There are GPGPU things other than machine learning which will likely follow.

Right now, I have a card with 16GB, which cost a bit over a grand. This means it can do Stable Diffusion at 512x512, but not higher resolutions. It can do some NLP models, but not GPT-3 scale ones. It can do awesome speech recognition.

These things are useful enough that I claim more people will want to use them as soon as they don't require knowing Python. I think it's just a matter of time before enterprising folks write code to make some of these workflows available to everyone. I wouldn't be surprised if GPGPU becomes standard, at least in business laptops, in the near future, for the productivity increase.

If Intel can do this for 1/2-1/3 the cost of NVidia, the impact will be huge. If Intel can democratize truly large models, like GPT-3, the impact will be even greater.

If a model takes 20 seconds instead of 10, that's not a huge difference. If it runs out of VRAM, it is.
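A back-of-the-envelope calculation shows why VRAM is the hard limit here. The parameter counts are public (GPT-3 is ~175 billion; Stable Diffusion's UNet is roughly 0.9 billion); the 2-bytes-per-parameter figure assumes fp16 weights and ignores activations, optimizer state, and KV caches, so real requirements are strictly higher:

```python
# Rough VRAM check: do a model's weights alone fit on a card?
# Assumes fp16 (2 bytes per parameter); actual usage is higher because
# activations and intermediate buffers also live in VRAM.

def weights_gb(params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for model weights, in GiB."""
    return params * bytes_per_param / 2**30

def fits(params: float, vram_gb: float) -> bool:
    """True if the weights alone fit in the given VRAM budget."""
    return weights_gb(params) <= vram_gb

print(f"GPT-3 (~175B params): ~{weights_gb(175e9):,.0f} GiB of weights")
print(f"SD UNet (~0.9B params): ~{weights_gb(0.9e9):,.1f} GiB of weights")
print(fits(0.9e9, 16), fits(175e9, 16))  # fits on 16 GB? SD yes, GPT-3 no
```

GPT-3's weights alone come to over 300 GiB in fp16, which is why no consumer card at any price point runs it locally, while a ~16 GB card handles Stable Diffusion with room to spare.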

business market >>> data center market > gamer market

This kind of stuff was Intel's sweet spot maybe two decades ago. They shipped high-quality numerical code, stable drivers, a lot of open-source, and robust integration with operating systems.


I’m curious how good Linux support is. For network cards, Intel generally does a great job with Unix and Linux support. Everyone strongly recommends their NICs for servers and workstations.

AMD support is getting better on Linux, especially over the last 5 years. But if Intel is similar enough, they could win the small Linux market by having few to zero driver issues.


https://www.phoronix.com/review/intel-a750-a770-arc-linux

https://www.phoronix.com/review/intel-arc-graphics-linux

TL;DR: quite good, open drivers, but not yet in distro releases. Some reports say power management is still broken, leading to very high idle power usage.


Has anyone read whether GVT-g (or something similar that allows attaching the GPU to multiple VMs) is supported by these cards? Last time I checked, Nvidia & AMD only support this on their enterprise / datacenter cards, and it would be cool if Intel were more open.


They are good. They are not amazing. Radeon RX 6650 XT at $300 kind of spoils it for Intel. But they should help keep AMD pricing in check.


Intel driver support for games on their onboard video has historically been very bad


ray tracing is a game changer for me


I see what you did there!



