I’ve seen iOS apps that let you download open source LLMs like Llama/Mistral and run them locally. But is there any app/solution that would let me use an iPad as an inference backend from another computer on my LAN?
I’m curious whether it might be worth getting the new iPad Pro M4, which I’d guess should be pretty fast at inference, but it’s obviously a very locked-down system, so I’m not sure if it’s viable.
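To be concrete, what I’m hoping for is an app that exposes an HTTP endpoint over the LAN, ideally an OpenAI-compatible one like llama.cpp’s `llama-server` provides on desktop, so another machine could just POST to it. A minimal sketch of the client side, assuming a hypothetical iPad address and an OpenAI-compatible chat endpoint (whether any iPad app actually serves this is exactly my question):

```python
import json
import urllib.request

# Hypothetical LAN address for the iPad; the endpoint path assumes an
# OpenAI-compatible server (the shape llama.cpp's llama-server exposes).
BASE_URL = "http://192.168.1.50:8080"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completion request aimed at the LAN endpoint."""
    payload = {
        "model": "local",  # many local servers ignore or default this field
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello from across the LAN")
# urllib.request.urlopen(req) would perform the actual call
```

If something on iPadOS served that endpoint, any existing OpenAI-client tooling on the desktop would work against it unchanged.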
It really does not make sense to pay for a screen and form factor you won't use though. You could make a $500 headless inferencing server with a few used 3060s and buy a used iPad Pro with the savings.