AMD Radeon RX 7600 XT Linux Performance (phoronix.com)
35 points by jlpcsl 12 months ago | 18 comments



16GB for $330. The price is about the same as an Arc A770 16GB, and about 66% of a 4060 Ti 16GB.

That's potentially compelling.

I'd love to see benchmarks and compatibility for PyTorch, llama, Stable Diffusion, etc. I assume it's still a train wreck, but if AMD can get its software stack in shape, that'd be very, very competitive.
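
Even a quick smoke test would say a lot. Something like this (assuming a ROCm build of PyTorch is installed; torch.version.hip is only populated on ROCm builds, and torch.cuda.is_available() maps onto the HIP backend there):

    python3 -c "import torch; print(torch.__version__, torch.version.hip, torch.cuda.is_available())"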


A limited anecdote for you:

I played a lot of War Thunder on Devuan (Debian minus systemd) through Steam.

Up till last year, I owned an nVidia 1080 Ti. Performance in Linux was quite good, but I had crash after crash in War Thunder - Gaijin blamed nVidia for the bug, nVidia blamed Gaijin. The end result was that it remained crashy. Closed source, so nobody who really cares can look. The crashes were so frustrating I decided to go AMD. nVidia driver updates are also sometimes a crapshoot: most times it works fine, but every now and again an update results in a broken Xorg, which is a PITA to fix, and heaven help you if you end up with a mix of the nouveau and official drivers.

Last year, I bought an AMD 6700XT as the prices dropped. I chose the 6700XT as it seemed like the best performance at the lowest power/cooling requirement.

Gaming has been flawless with just the in-kernel drivers using RADV. No add-on software stack required; "it just works". I play a lot of games through Proton/Steam, so if issues were going to pop up, they would show up there. I had some initial confusion, as I thought I'd need to download drivers from AMD, but as it turns out everything works great on kernel 6.x automatically, provided you have the non-free firmware packages installed.
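
For anyone on Debian/Devuan hitting the same confusion, roughly all it took was something like this (assuming the non-free / non-free-firmware component is enabled in sources.list; the package name will differ on other distros):

    sudo apt install firmware-amd-graphics   # amdgpu firmware blobs
    sudo update-initramfs -u                 # then reboot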


Was there anything you did to help run the game after the switch? I have a 7900 xtx and WT crashes after each match :/

I am running Fedora w/ Wayland.


I'm not running Wayland, so I can't answer questions on that. Also, you have the 7900, and I imagine the drivers for that are pretty new. You'll want a fairly recent kernel (6.1+). You might also want to look for updated AMD firmware on kernel.org - do a "grep -i firmware /var/log/messages", or syslog, or whatever Fedora is using now. The output of "dmesg" probably also contains some clues. Feel free to post back here if you find anything.
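
Concretely, something along these lines (Fedora logs to the journal, so journalctl rather than /var/log/messages):

    uname -r                                   # want 6.1 or newer for RDNA3
    sudo dmesg | grep -i -E 'amdgpu|firmware'
    journalctl -k | grep -i firmware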


For Stable Diffusion, I have found the DirectML fork works great (https://community.amd.com/t5/ai/how-to-automatic1111-stable-...)

Needs a few start arguments (--lowvram --precision full --no-half --skip-torch-cuda-test). YMMV.
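
In other words, a launch along these lines (or put the same flags in COMMANDLINE_ARGS in webui-user.bat if you use the stock launcher):

    python launch.py --lowvram --precision full --no-half --skip-torch-cuda-test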

GPT4All also works well for LLMs on AMD.


Many thanks for the reference to GPT4All. I wasn't aware of it before.


Kind of. Here's the basic problem:

1) The BIG tools have SOME port to AMD. I am confident that one can run Mistral or SD in some framework there.

2) It's a port. If you want to do anything bleeding edge, development, niche, or similar, it's a dumpster fire.

I do some amount of development which uses LLMs. I want to be able to use a model directly from Hugging Face and have it just work.


The only reason I am using the DirectML fork of Automatic1111 is because I am on Windows and PyTorch hasn't caught up to ROCm 6.
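
For reference, the ROCm builds of PyTorch on Linux come from a pinned wheel index that is still on the 5.x series, e.g. something like:

    pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.7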

DirectML is a fully supported path on Windows and is supported by Microsoft et al. (https://github.com/microsoft/DirectML).

Everyone is moving off CUDA as quickly as possible, not because the alternatives are better, per se, but because doing so is easier and cheaper.


I've had very few issues (other than not-amazing perf) with my last-gen 6750 XT running llama.cpp and SD for the last 8-10 months.


The article doesn't mention it (or I missed it), but I wonder what the idle power usage is on these. I know with Nvidia cards you can run nvidia-persistenced to keep them in a lower power state at idle; not sure what to do with AMD cards (if anything is needed).


Allegedly, it depends on whether the card can clock down the memory. My 7900 XTX still draws ~100W when I have two 1440p monitors at a high refresh rate, purely based on how many pixels it pushes.
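
You can see what it's doing via the amdgpu hwmon interface (assuming card0 is the dGPU; the exact file name varies a bit between generations, and lm-sensors reads the same values):

    sensors                                                      # amdgpu section reports board power
    cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_average  # microwatts; power1_input on some cards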


It's RDNA2 rather than 3, but my 6800 XT idles at 36W. I have a 4K TV and a 1440p monitor connected. It looks like the memory idles at 1000 MHz.
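
For reference, the current memory clock state is visible in sysfs (assuming card0 is the AMD GPU; the active state is marked with a *):

    cat /sys/class/drm/card0/device/pp_dpm_mclk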


Assuming AMD's $330 MSRP holds, it seems like a pretty weak value proposition. Remaining stock of the 6700XT can be found in the $330-$350 range, and depending on the benchmark, it equals or beats the 7600XT handily.


I'm just glad AMD seems to have sorted most of the issues with RDNA3 on linux. It's now worth it to consider the 7800 XT as a good upgrade path, if the price is right.

Wendell from level1techs/level1linux has shown the 7600XT to be a good cheap AI card due to its higher VRAM compared to other similar cards.

His review is very favorable, recommending it as a decent card for Linux users: https://www.youtube.com/watch?v=phJVjfGQAyI


Interesting - I bought an RTX 3060 Ti last year because I didn't really know what information to trust w/rt what makes a "good budget AI card" vs a "bad budget AI card." As much as I'm feeling the limited RAM (really should have gotten the 12GB 3060), I just wouldn't even know where to start when setting up an AMD card for GPGPU tasks. It feels like every idiot-friendly guide for home AI tinkering assumes you have an Nvidia GPU.


Been running a 7900XTX/7950X3D build since the end of November with zero issues (latest Fedora).

General dev usage, nothing AI though.


I've also been running a 7900 XTX on NixOS unstable since October or so and have experienced no issues whatsoever.


For a 6700, or even a 5700, the 7600 doesn't seem like much of an upgrade. You'd need to get a 7700 or above to make it worthwhile, probably. Or just wait for the next gen. Which is likely to be my own choice.



