
Most teams don't want to self-host, and they definitely don't want to run models on-device, eating up their RAM.


I get the self-host part, but if you had a dedicated machine, would RAM be an issue? Can you run it on a machine with, say, 128 GB of RAM or the GPU equivalent?
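For a rough sense of whether 128 GB is enough: weight memory scales with parameter count times bytes per parameter. A minimal back-of-the-envelope sketch, where the model sizes and quantization levels are illustrative assumptions and KV cache / activation overhead is ignored:

    # Rough estimate: weight memory ~= parameter count * bytes per parameter.
    # Model sizes and bit widths below are illustrative, not tied to any specific model or runtime.

    def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
        """Approximate memory needed just to hold the weights, in GB."""
        bytes_total = params_billion * 1e9 * bits_per_weight / 8
        return bytes_total / 1e9

    for params, bits in [(7, 16), (70, 16), (70, 4)]:
        print(f"{params}B params @ {bits}-bit: ~{weight_memory_gb(params, bits):.0f} GB")

    # ~7B @ 16-bit  -> ~14 GB
    # ~70B @ 16-bit -> ~140 GB (over a 128 GB budget)
    # ~70B @ 4-bit  -> ~35 GB  (fits, with headroom for KV cache and activations)

So a quantized mid-size model fits comfortably in 128 GB; a large model at full precision may not.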


There is no reason these models will be self-host only.


Agreed, and I can't wait for GPT-4 to have great competition in terms of ease, price, and performance. I was responding to this:

> something that should just be completely on-device or self-hosted if you don't trust cloud-based AI models like ChatGPT Enterprise and want it all private and low cost
