
This is great! I really wish Apple allowed your device to query a model you host, instead of falling back to their cloud (or OpenAI). I'd love to have a Mac Studio running at home and have my iPhone, iPad, Mac, and HomePod query it instead of going to the cloud. That way I'd have even stronger privacy assurances, and I could choose which model to run.



Do you mean with Apple Intelligence? You can already query self-hosted models from Apple devices using exo, or even run inference locally on-device.
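To make this concrete: exo (like Ollama and llama.cpp's server) exposes an OpenAI-compatible chat-completions API, so any Apple device on the same network can talk to it with a plain HTTP request. A minimal sketch below; the host, port, and model name are assumptions for illustration, not exo's confirmed defaults.

```python
import json
import urllib.request

# Hypothetical local endpoint: exo and similar self-hosted servers expose an
# OpenAI-compatible chat-completions API. Host/port here are assumptions.
ENDPOINT = "http://localhost:52415/v1/chat/completions"

def build_request(prompt: str, model: str = "llama-3.2-3b") -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completion request for a self-hosted model."""
    payload = {
        "model": model,  # whatever model your local server has loaded
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("What's on my calendar today?")
# Actually sending it (urllib.request.urlopen(req)) requires the local
# server to be running, so that step is omitted here.
```

Because the wire format matches OpenAI's, existing OpenAI client libraries also work against such a server by pointing their base URL at the local endpoint.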


Does this work with Siri? I'm not running the beta, so I'm not familiar with the features and limitations, but I thought it either answered using on-device inference (with a closed model) or Apple's cloud (with a model you can't choose). My understanding is that you can ask OpenAI via an integration they've built, and that in the future you may be able to reach other hosted models. But I didn't see anything about seamlessly reaching your own locally hosted models, whether as a Siri fallback or for anything else. But like I said, I'm not running the beta!



