
> Download as many LLM models as you like, plus the latest version of Ollama.app and all its dependencies.

I recently purchased a Mac Studio with 128 GB of RAM for the sole purpose of being able to run 70B models at 8-bit quantization.
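
For context, a back-of-the-envelope estimate of why 128 GB is enough: at 8 bits per parameter, the quantized weights of a 70B model take roughly 65 GiB, and the KV cache adds a few more GiB depending on context length. A minimal sketch of that arithmetic is below; the layer/head counts are assumptions roughly matching Llama-style 70B models, not figures from this thread.

    # Rough memory estimate for a 70B model at 8-bit quantization.
    # Architecture numbers (layers, KV heads, head dim) are assumptions
    # for a Llama-style 70B model, not values quoted in the thread.
    GIB = 1024 ** 3

    def weight_bytes(params: float, bits_per_param: int) -> float:
        """Bytes needed to hold the quantized weights."""
        return params * bits_per_param / 8

    def kv_cache_bytes(context_tokens: int,
                       n_layers: int = 80,       # assumed
                       n_kv_heads: int = 8,      # assumed (grouped-query attention)
                       head_dim: int = 128,      # assumed
                       bytes_per_value: int = 2  # fp16 cache
                       ) -> float:
        """Bytes for the key/value cache at a given context length."""
        per_token = 2 * n_kv_heads * head_dim * bytes_per_value * n_layers
        return context_tokens * per_token

    weights = weight_bytes(70e9, 8)   # ~65 GiB
    kv = kv_cache_bytes(8192)         # ~2.5 GiB at an 8K context
    total = weights + kv

    print(f"weights:  {weights / GIB:6.1f} GiB")
    print(f"KV cache: {kv / GIB:6.1f} GiB")
    print(f"total:    {total / GIB:6.1f} GiB of 128 GiB unified memory")

So roughly 68 GiB in total, which leaves plenty of headroom on a 128 GB machine even with the OS and other apps running.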

M4?

Mac Studios don't have the M4 yet. With that much RAM, GP must have the M2 Ultra.

Well done, good for you!


