Hacker News

I don't have a PC with a powerful GPU. What's the easiest way I can play with Llama on AWS, Google Cloud, or somebody else's computer?


You can play with Llama on your CPU. Depending on the model you use and the RAM you have available, the performance may be acceptable.


Use llama.cpp; it runs on the CPU.

This news story is that llama.cpp is now gaining GPU support.
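A minimal sketch of the CPU workflow the comment describes, assuming you already have quantized model weights (the model path and filename below are placeholders, not files that ship with the repo):

```shell
# Build llama.cpp from source (CPU-only by default).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run a prompt on the CPU.
# -m: path to your quantized model file (placeholder; bring your own weights)
# -p: the prompt, -n: tokens to generate, -t: CPU threads to use
./main -m ./models/7B/ggml-model-q4_0.bin \
       -p "Building a website can be done in 10 simple steps:" \
       -n 128 -t 8
```

Performance scales with thread count and model size, which is why the parent comment hedges on whether CPU speed will be acceptable for your RAM and hardware.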


Do you know about Oobabooga?

You can probably find a Google Colab link for it.


Is that like Stable Diffusion, but for LLMs? Neat.



