Hacker News

I had no idea Windows users had no access to Ollama; it feels like only a few years ago we Mac users were the ones having to wait.



It has worked just fine under WSL for many months now, including full GPU support, though that's not as convenient for most. Native Windows support is icing on the cake.


Indeed, WSL has surprisingly good GPU passthrough and AVX instruction support, which makes running models fast despite the virtualization layer. That said, WSL comes with its own setup steps and performance considerations (not to mention that quite a few folks still have WSL 1 in their workflow), so a lot of people asked for a pre-built Windows version that runs natively!
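One quick way to sanity-check that GPU passthrough is actually wired up inside a WSL 2 distro before installing anything. This is a sketch assuming an NVIDIA GPU; WSL 2 surfaces the Windows host driver to the Linux guest under /usr/lib/wsl/lib, so the presence of those libraries is a reasonable (not authoritative) signal:

```shell
#!/bin/sh
# Sketch: check for the GPU driver libraries WSL 2 maps in from the
# Windows host (NVIDIA paths assumed; other vendors differ).
if ls /usr/lib/wsl/lib/libcuda.so* >/dev/null 2>&1; then
    echo "WSL GPU driver libraries found"
else
    echo "No WSL GPU libraries (not WSL 2, or no GPU passthrough)"
fi
```

If the libraries are present, `nvidia-smi` run inside the guest should also report the host GPU.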


I've been running Ollama in Windows WSL for some time now.

It's x86 Linux after all. Everything just works.


There’s some magic with the WSL GPU drivers.




