16GB for $330. The price is about the same as an Arc A770 16GB, and about 66% of the price of a 4060 Ti 16GB.
That's potentially compelling.
I'd love to see benchmarks and compatibility for PyTorch, llama.cpp, Stable Diffusion, etc. I assume it's still a train wreck, but if AMD can get its software stack in shape, that'd be very, very competitive.
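If anyone wants to run a quick check themselves, here's a rough sanity test, assuming ROCm and a ROCm wheel of PyTorch are already installed (gfx1102 is, I believe, the target for Navi 33 / the 7600 XT):

  # Does ROCm see the card, and as which gfx target?
  rocminfo | grep -i gfx
  # ROCm builds of PyTorch reuse the CUDA API surface, so this works as-is:
  python3 -c "import torch; print(torch.cuda.get_device_name(0))"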
I played a lot of War Thunder on Devuan (Debian minus systemd) through Steam.
Up until last year, I owned an nVidia 1080 Ti. Performance on Linux was quite good, but I had crash after crash in War Thunder - Gaijin blamed nVidia for the bug, nVidia blamed Gaijin, and the end result is that it stayed crashy. The driver is closed-source, so nobody who really cares can look. The crashes were so frustrating that I decided to go AMD. nVidia driver updates are also something of a crapshoot: most of the time they work fine, but every now and again an update results in a broken Xorg, which is a PITA to fix - and heaven help you if you end up with a mix of nouveau and the official drivers.
Last year, I bought an AMD 6700XT as the prices dropped. I chose the 6700XT because it seemed to offer the best performance for the lowest power/cooling requirements.
Gaming has been flawless with just the in-kernel drivers using RADV. No add-on software stack required; "it just works". I play a lot of games through Proton/Steam, so if issues were going to pop up, that's where they'd show. I had some initial confusion because I thought I'd need to download drivers from AMD, but it turns out everything works great on kernel 6.x automatically, provided you have the non-free firmware packages installed.
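For anyone on Debian/Devuan wanting to replicate this, a minimal sketch of the setup (the package names are the current Debian ones; vulkaninfo comes from the vulkan-tools package):

  # Requires the non-free-firmware component in your apt sources.
  sudo apt update
  sudo apt install firmware-amd-graphics mesa-vulkan-drivers
  # Verify the in-kernel driver and Mesa's RADV Vulkan driver are in use:
  lspci -k | grep -A3 -i vga            # should show "Kernel driver in use: amdgpu"
  vulkaninfo --summary | grep -i radv   # RADV is Mesa's Vulkan driver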
I'm not running Wayland, so I can't answer questions on that. Also, you have the 7900, and I imagine the drivers for that are pretty new. You'll want a fairly recent kernel (6.1+). You might also want to look for updated AMD firmware on kernel.org - do a "grep -i firmware /var/log/messages", or syslog, or whatever Fedora is using now. The output of "dmesg" probably also contains some clues. Feel free to post back here if you find stuff.
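Since Fedora logs through journald rather than /var/log/messages, something like this should turn up the relevant lines (the grep patterns are just my guess at what to look for):

  # Kernel messages from the current boot:
  sudo dmesg | grep -iE 'amdgpu|firmware'
  journalctl -k -b | grep -iE 'amdgpu|firmware'
  # The firmware blobs themselves ship in the linux-firmware package:
  ls /lib/firmware/amdgpu/ | head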
The article doesn't mention it (or I missed it), but I wonder what the idle power usage is on these. I know that with Nvidia cards you can run nvidia-persistenced to keep the card in a lower power state at idle; not sure what to do with AMD cards (if anything is needed).
Allegedly, it depends on whether the card can clock down the memory. My 7900 XTX still draws ~100W when I have two 1440p monitors at a high refresh rate, purely because of how many pixels it pushes.
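If you want to check this on an amdgpu card, the driver exposes the current clock states and power draw in sysfs (a rough sketch; the card and hwmon indices vary by system):

  # Memory clock levels; the active one is marked with '*'. If it's pinned
  # at the highest level while idle, the VRAM isn't clocking down.
  cat /sys/class/drm/card0/device/pp_dpm_mclk
  # Board power draw via hwmon, reported in microwatts:
  cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_average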
Assuming AMD's $330 MSRP holds, it seems like a pretty weak value proposition. Remaining stock of the 6700XT can be found in the $330-$350 range, and depending on the benchmark, it either matches the 7600XT or beats it handily.
I'm just glad AMD seems to have sorted most of the issues with RDNA3 on Linux. It's now worth considering the 7800 XT as a good upgrade path, if the price is right.
Wendell from level1techs/level1linux has shown the 7600XT to be a good cheap AI card due to its higher VRAM compared to other similarly priced cards.
Interesting - I bought an RTX 3060 Ti last year because I didn't really know what information to trust about what makes a "good budget AI card" vs. a "bad budget AI card". As much as I'm feeling the limited VRAM (I really should have gotten the 12GB 3060), I wouldn't even know where to start when setting up an AMD card for GPGPU tasks. It feels like every idiot-friendly guide for home AI tinkering assumes you have an Nvidia GPU.
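For what it's worth, the rough starting point on the AMD side is a ROCm build of PyTorch, plus a gfx-version override for consumer cards that ROCm doesn't officially support. A minimal sketch, assuming a ROCm 6.x install; the wheel index URL follows PyTorch's published pattern, but check pytorch.org for the current one:

  # Install a ROCm build of PyTorch (the version in the URL may differ):
  pip install torch --index-url https://download.pytorch.org/whl/rocm6.0
  # Consumer cards often need to masquerade as an officially supported
  # gfx target, e.g. RDNA2 cards as gfx1030:
  export HSA_OVERRIDE_GFX_VERSION=10.3.0
  # torch.cuda works here because ROCm builds reuse the CUDA API:
  python3 -c "import torch; print(torch.cuda.is_available())"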
For a 6700, or even a 5700, the 7600 doesn't seem like much of an upgrade. You'd need to get a 7700 or above to make it worthwhile, probably. Or just wait for the next gen. Which is likely to be my own choice.