
> The interesting point here is that developers targeting the Mac can safely assume that the users will have a processor capable of significant AI/ML workloads

Also that a significant proportion (majority?) of them will have just 8 GB of memory, which is not exactly sufficient for running complex AI/ML workloads.
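
For a rough sense of scale (the figures below are illustrative assumptions, not benchmarks): weights alone for a 7B-parameter model at fp16 come to roughly 14 GB, which already blows past 8 GB before the OS and other apps are counted, while 4-bit quantization brings that down to about 3.5 GB.

    func estimatedWeightGB(parameters: Double, bytesPerParameter: Double) -> Double {
        // Weights only; the KV cache, activations and the rest of the app add more on top.
        parameters * bytesPerParameter / 1_000_000_000
    }

    let sevenB = 7_000_000_000.0
    print(estimatedWeightGB(parameters: sevenB, bytesPerParameter: 2.0))  // fp16:  ~14 GB
    print(estimatedWeightGB(parameters: sevenB, bytesPerParameter: 0.5))  // 4-bit: ~3.5 GB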




Easy solution: just page multiple gigabytes of your model out to SSD-backed swap when you run out of memory. What could possibly go wrong?
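
The non-sarcastic version of this is to memory-map the weights so the OS pages them in from SSD on demand; the performance cliff comes when those pages keep getting evicted and re-faulted under memory pressure. A minimal sketch (the file path is a placeholder, not a real model):

    import Foundation

    // Memory-map a weights file so pages are faulted in from SSD on demand
    // instead of being copied into RAM up front.
    let weightsURL = URL(fileURLWithPath: "/path/to/model.weights")
    do {
        // .mappedIfSafe asks Foundation to mmap the file when it is safe to do so.
        let weights = try Data(contentsOf: weightsURL, options: .mappedIfSafe)
        print("Mapped \(weights.count) bytes without an up-front resident copy")
    } catch {
        print("Could not map weights: \(error)")
    }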



