Hacker News
Did I just create the world's smallest AI server?
10 points by willmccollum 73 days ago | 4 comments
I successfully installed Ollama in Termux on my degoogled Unihertz Jelly Star, reputed to be the world's smallest smartphone, with a 3-inch screen. The Jelly Star packs 8 GB of RAM plus 7 GB of virtual RAM. I downloaded and then successfully ran the distilled DeepSeek-R1 7B model locally on the device. Is it slow? Yes. But it still steadily outputs text word by word, doesn't crash, and takes no longer than a couple of minutes to respond in full. Anyone have other examples of micro AI workstations?
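For anyone wanting to try this, a rough sketch of the Termux setup (assuming an `ollama` package is available in your Termux repo; package names and the exact model tag may differ on your setup):

```shell
# Update Termux packages and install Ollama
pkg update && pkg upgrade
pkg install ollama

# Start the Ollama server in the background
ollama serve &

# Pull and run the distilled DeepSeek-R1 7B model
# (several-GB download; expect slow, word-by-word output on phone hardware)
ollama run deepseek-r1:7b
```

If the package isn't in your repo, building the Ollama binary for aarch64 yourself is the usual fallback.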



I'm running Ollama + Smollm on this thing: https://pine64.org/devices/quartz64_model_b/


Very cool to learn about that device. I want to see how small I can go. The main surprise for me was being able to run the 7B-parameter version at all.


Nvidia's Orin Nano Super is pretty recent and a very neat little piece of kit for the price: https://www.nvidia.com/en-us/autonomous-machines/embedded-sy...

Basically a 25-watt Raspberry Pi with an integrated CUDA-capable GPU. Haven't got one myself, but this probably represents the best size/performance inflection point for consumers right now.


>Unihertz

Man, I had the Titan Pocket. I got it and loved the physical keyboard. What a disappointment, though: the Wi-Fi would just cut out, and there was no fix available.



