Hacker News
Ask HN: iPad Pro as an LLM backend?
2 points by cal85 10 days ago | 1 comment
I’ve seen iOS apps that let you download open source LLMs like Llama/Mistral and run them locally. But is there any app/solution that would let me use an iPad as an inference backend from another computer on my LAN?

I’m curious to see whether it might be worth getting the new iPad Pro M4, which I’m guessing should be pretty fast at inference, but it’s obviously a very locked down system so I’m not sure if it’s viable.
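For what it's worth, if such an app existed and exposed an OpenAI-compatible HTTP endpoint on the LAN (the de facto standard that llama.cpp's server and most desktop wrappers use), the client side would be trivial. This is only a sketch of that hypothetical setup — the IP, port, and model name are placeholders, not values from any real iPad app:

```python
# Hypothetical client sketch: assumes an iPad app serves an
# OpenAI-compatible /v1/chat/completions endpoint on the LAN.
# The address and model name below are made-up placeholders.
import json
import urllib.request

IPAD_URL = "http://192.168.1.50:8080/v1/chat/completions"  # placeholder LAN address

def build_request(prompt: str, model: str = "llama-3-8b") -> urllib.request.Request:
    """Build a chat-completion POST request for an OpenAI-compatible server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        IPAD_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(prompt: str) -> str:
    """Send the prompt to the LAN server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize the M4 iPad Pro in one sentence."))
```

So the blocker isn't the client protocol; it's whether iPadOS lets an app keep a server running (and the model resident in memory) in the background.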






You would get better price/performance from almost literally any new hardware in its price range. There are sub-$1000 Nvidia laptops that would run circles around it purely for backend purposes.

It really doesn't make sense to pay for a screen and form factor you won't use, though. You could build a $500 headless inference server with a few used 3060s and buy a used iPad Pro with the savings.



