Even a small startup, a researcher, or a tinkerer can get a cloud instance with a beefy GPU. Also of note: Apple's M1 Max/Ultra should be able to run it on their GPUs given their 64/128 GB of memory, right? That's an order of magnitude cheaper.
I am confused. Those amounts are RAM, not GPU RAM, aren't they? Mac CPUs are impressive, but not for ML. The most realistic option for a consumer is an RTX 4090 with 24 GB. A lot of models do not fit in that, so you need an A6000 with 48 GB or more from the professional cards. That might be around €9000 already.
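To put rough numbers on "a lot of models do not fit in 24 GB": a quick back-of-the-envelope is parameter count times bytes per parameter. This is a minimal sketch (weights only, assuming fp16; activations and KV cache add more on top in practice):

```python
def vram_gb(params_billion, bytes_per_param=2):
    """Rough weights-only memory estimate; 2 bytes/param = fp16."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for b in (7, 13, 30, 65):
    print(f"{b}B params @ fp16: ~{vram_gb(b):.0f} GB")
```

So a 13B model at fp16 already lands just above 24 GB, and anything in the 30B+ range needs the 48 GB professional cards (or quantization) before it fits at all.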