Intel Arc B580 Delivers Promising Linux GPU Compute Potential for Battlemage (phoronix.com)
36 points by zdw 11 days ago | 23 comments





I would love some analysis of perf/watt.

The Intel Arc cards are NOT power efficient based on other reviews: the B580's total power draw is very similar to the 4070's. The perf/watt should be damning, but this review doesn't state that.


Gamers Nexus did some power efficiency numbers on this new card:

It was fine when playing games (Windows), totally comparable to AMD/Nvidia:

https://youtu.be/JjdCkSsLYLk?si=pUDwsGkCuenr2Iol&t=1874

But it did poorly at idle (30+ watts).


According to TechPowerUp, idle power is absolutely fine, but only if you enable ASPM: https://www.techpowerup.com/review/intel-arc-b580/38.html

I guess GN did not enable ASPM?
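
For what it's worth, on Linux the kernel's runtime ASPM policy can be checked (and, with root, changed) through sysfs. A minimal Python sketch, assuming a kernel built with ASPM support (whether it actually takes effect also depends on the BIOS handing ASPM control to the OS):

    # Inspect (and optionally set) the Linux PCIe ASPM policy via sysfs.
    # Assumption: kernel built with ASPM support; writing needs root and can be
    # refused if the firmware keeps control of ASPM.
    from pathlib import Path

    POLICY = Path("/sys/module/pcie_aspm/parameters/policy")

    def current_policy() -> str:
        # The active policy is shown in brackets,
        # e.g. "default performance [powersave] powersupersave"
        return POLICY.read_text().strip()

    def set_policy(name: str = "powersave") -> None:
        POLICY.write_text(name)

    if __name__ == "__main__":
        print("ASPM policy:", current_policy())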


To be honest...it doesn't matter for most use cases. If the card can be powered with a single 8-pin connector, and has plenty of cooling for the heat it generates...who cares? If you have 1000 of them in a building then that's a different situation entirely.

Really, for most users, all that matters is price, performance and functionality.

With a CPU it matters a little more since you need to get cooling yourself.


This article is about compute, where a lot of users are considering near-24/7 usage in some way or another, so perf/watt data would be very helpful to have.

Heavy gaming users would probably hit the cost vs usage overlap of the extra power draw as well.

For other users, idle power draw matters too, and the Intel cards have been very bad with this, so it would be nice to state it.

Lastly, I use perf/watt as a proxy for architectural headroom, which is just something I like to get a feel for. I suspect Intel would need a completely new architecture to get into datacenters, and that is good to know.


> Heavy gaming users would probably hit the cost vs usage overlap of the extra power draw as well.

They absolutely would not. Maybe if they're paying California or Germany electricity prices, and even then it's not a huge impact. For every other home desktop user it doesn't matter.

That said, GamersNexus pointed out that these cards have somewhat poor idle power consumption.

The truth is Nvidia is somewhat untouchable on performance per watt right now so their competitors at some point have to throw power at it in order to make a marketable card.

Finally, it's also notable that this Intel card performs better than the 4060 as resolutions get higher. So there may be some light gamers in the $250-300 price range, playing the kind of game they want to run at high resolution on a budget card, for whom this SKU makes sense.

Playing a less graphically intensive game on a high-resolution display seems to be the sweet spot for this card. It might draw more power than a 4060, but it can outperform it in the right scenario and be acquired for a lower price.


>where a lot of users are considering near 24/7 usage in some way or another

Who are these "a lot of users" doing this? Are they "a lot" of the total gamers' market share this card is aimed at, or are they "a lot" in your reality/bubble?


Even gaming 8 hours a day, the difference between a B580 and a more efficient card multiplied by the cost of the electricity over ~3-4 years is...next to nothing.

I'm not saying Intel shouldn't improve that aspect, they should. But to me it's kinda silly to focus on.

As for using B580's for compute, I don't really see the appeal when so much more powerful options exist.


They published another review this morning, but with games [1]; it is linked in the page. You can find plenty of data and charts about power consumption, and yes, it seems they are NOT power efficient.

[1] https://www.phoronix.com/review/intel-arc-b580-graphics-linu...


Game power efficiency and GPU compute power efficiency aren't necessarily highly correlated. Game power efficiency is often achieved with highly tuned drivers, and of course Nvidia has a massive lead on Intel for that.

Perf/watt is a deal breaker when power is used to "hide" lackluster performance. Intel did this with the 11th and 13th CPU gens and it showed a lot.

I don't think it makes much sense here because the gap is not that large. The B580 consumes more power (let's say about 50W, though it seems to be a smaller delta in a lot of cases) than an RTX 4060, but it's also faster and $50 cheaper.

Even at a fairly high electricity rate of $0.32/kWh, you are looking at 20 hours of play to get 1kWh consumed. That means you need to game a whopping 3125 hours to get to breakeven.
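
Back-of-the-envelope sketch of that breakeven, reusing the same assumed numbers (50 W delta, $0.32/kWh, $50 price difference):

    # Gaming hours needed before a 50 W power-draw delta at $0.32/kWh
    # eats up a $50 purchase-price difference (numbers from the comment above).
    POWER_DELTA_W = 50      # assumed extra draw of the B580 vs an RTX 4060
    PRICE_PER_KWH = 0.32    # assumed (fairly high) electricity rate, USD
    PRICE_GAP_USD = 50      # assumed purchase price difference

    cost_per_hour = POWER_DELTA_W / 1000 * PRICE_PER_KWH   # USD per gaming hour
    print(f"{PRICE_GAP_USD / cost_per_hour:.0f} hours")    # ~3125 hours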


The B580 consumes quite a bit more when idle though (~35 W vs ~10 W), which can add up if you keep your PC on a lot of the time. IIRC even the 14900K is less than 20 W when idle.

So not great... but yeah, you'd still need to run it close to 24x365 to get to $50.
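
As a rough sketch with the same assumed $0.32/kWh rate, that idle delta takes most of a year of 24/7 uptime to reach $50:

    # Hours for a ~25 W idle delta (35 W vs 10 W) to cost $50 at $0.32/kWh,
    # assuming the machine just sits idle the whole time.
    IDLE_DELTA_W = 35 - 10
    PRICE_PER_KWH = 0.32
    hours = 50 / (IDLE_DELTA_W / 1000 * PRICE_PER_KWH)
    print(f"{hours:.0f} hours (~{hours / 24:.0f} days)")   # ~6250 hours, ~260 days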


I had missed the idle consumption. That being said, if they are that energy conscious, the next question would be why they aren't shutting down their computer when not in use.

You shouldn't need to shut down your computer just to have low power consumption when you're not doing much. I just built a home server PC using an AMD Ryzen CPU, and the entire machine consumes 24W, measured at the wall, when the hard drives are spun down. That's including the CPU/MB, memory, 4 case fans (at a lower rpm), and 1 NVMe drive, plus the CPU fan, the power supply fan, and all the losses from the power supply itself (power supplies aren't that efficient at idle loads). That's all less power than people here are saying this stupid GPU card consumes at idle (30W). Plugging in a GPU card and having it do nothing should not more than double my power consumption.

>I just built a home server

People building home servers aren't a big customer base. In other words, you don't really matter to them.


No one's using an Intel ARC graphics card in an enterprise setting. If they don't understand this and are not worried about power consumption, this is just yet another reason Intel is swirling the drain.

Is this specific card really something you would ever use in an enterprise/datacentre setting? It's just not its target market. IIRC Gaudi is almost entirely unrelated to Arc.

And even if we're talking about datacenter GPUs, power usage should be a relatively much smaller issue than for CPUs because of how expensive they are; e.g. depreciation for something like the H100 (assuming it actually starts losing value any time soon...) is more than the power + cooling costs anyway.
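
A rough sketch of that depreciation-vs-power claim, with very approximate assumed numbers (~$30k purchase price, ~700 W board power run flat out, $0.15/kWh, 4-year straight-line depreciation), ignoring cooling:

    # Very rough H100 depreciation vs power cost comparison.
    # All figures below are assumptions for illustration, not vendor numbers.
    PURCHASE_USD = 30_000    # assumed H100 price
    LIFETIME_YEARS = 4       # assumed straight-line depreciation period
    BOARD_POWER_W = 700      # assumed board power, running 24/7 at full tilt
    PRICE_PER_KWH = 0.15     # assumed datacenter electricity rate, USD

    depreciation_per_year = PURCHASE_USD / LIFETIME_YEARS                # ~$7,500
    power_per_year = BOARD_POWER_W / 1000 * 24 * 365 * PRICE_PER_KWH     # ~$920
    print(f"${depreciation_per_year:,.0f}/yr depreciation vs ${power_per_year:,.0f}/yr power")

Even doubling the power figure to account for cooling, depreciation still dominates.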


There is some perf-per-Watt data in there plus my other Battlemage Linux review out today. I'll also have more data in another article Friday~Monday.

If you're doing 1080p, it looks like a good price per fps. But at 1440p, like other mid-range cards, it's a "maybe".

Would be nice if 16 GB of memory was the default, to give some future headroom and for people wanting to do AI.


16GB is reserved for the upcoming Intel Arc B770 GPU.

What would the "yes" be for a mid-range 1440p card?

Reminder: try to use the latest Intel GPU drivers on Linux. These Intel devices are very new and constantly get software updates that fix obvious visual artefacts in 3D workloads (mostly games) and often improve performance as well.

Cool, glad to see another player. If I were still gaming, I'd try one out.


