
Is there any product that allows me to plug in my own open-source LLM and run it locally?


Yep. Continue.dev with Ollama. It works, but the suggestions tend to be short, like a smart autocomplete.

It needs a GPU, though not one with a lot of VRAM, since smaller models work better for this anyway because of speed.
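For reference, here's a minimal sketch of what a Continue config pointed at a local Ollama server can look like. The model names are just examples; swap in whatever you've pulled with `ollama pull`:

```json
{
  "models": [
    {
      "title": "Local Ollama",
      "provider": "ollama",
      "model": "codellama:7b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Autocomplete",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
```

Small models (3B or so) for the autocomplete role keep latency low enough to be usable.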


Tabby.ml also works well


Rubber duck does.

