Long-time ML worker here. People work in one of three ways:
1) Consumer Nvidia GPU cards on custom PCs
2) Self hosted shared server
3) Cloud infrastructure.
There is no "GPU compute only card" that is widely used outside servers.
> company PCs normally ban playing games on company PCs and the overlap of "needing max GPU compute" and "needing complicated 3D rendering tasks" is limited.
The "don't play games thing" isn't a factor. Most companies just buy a 4090 or whatever, and if they have to tell staff not to play games, they say "don't play games". Fortnight runs just fine on pretty much anything anyway.
My point is that a card bought for GPU compute not being able to work as a normal graphics card is not a problem.
I'm aware that GPU compute-only cards are not currently widely used outside of the server space.
But that's not because people need the consumer GPU features (besides video decode); it's because the economics of availability and cost make consumer GPUs the best option (and these economic effects don't just apply to customers but also to Nvidia itself).