
LM Studio, sort of. Unfortunately my MacBook is incapable of running all but the smallest models. It would be great if LM Studio could connect to a remote model running on a GPU server that I could rent.



You could set up something similar using Gradio quite easily:

https://www.jerpint.io/blog/model-inference/
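For a rough idea of what that looks like (a sketch, not the exact code from the post -- the model, port, and addresses are placeholders): run a small Gradio app next to the model on the rented GPU box, then call it from the laptop with gradio_client.

    # On the rented GPU server: wrap the model in a Gradio app.
    # gpt2 is just a stand-in for whatever model you actually run.
    import gradio as gr
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2", device=0)

    def generate(prompt):
        # Return the generated continuation for the given prompt.
        return generator(prompt, max_new_tokens=128)[0]["generated_text"]

    gr.Interface(fn=generate, inputs="text", outputs="text").launch(
        server_name="0.0.0.0", server_port=7860
    )

    # On the laptop: call the remote endpoint with gradio_client.
    # Replace the address with your server's IP or a Gradio share link.
    from gradio_client import Client

    client = Client("http://<server-ip>:7860")
    print(client.predict("Hello from my MacBook", api_name="/predict"))

Gradio's share=True option also gives you a temporary public URL if you don't want to open a port on the server.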



