
I think the standard setup of the Continue extension for VS Code with Ollama already covers 99% of the AI coding support I need. I think it is even better than commercial offerings like Cursor, at least in the projects and languages I use and have tested it with.

We had a Mac Studio here that nobody was using, and we now use it as a tiny AI station. If we like, we could even embed our codebases, but that hasn't been necessary yet. Otherwise it would be easy enough to just buy a decent consumer PC with a stronger GPU, but performance isn't too bad, even for autocomplete.
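
For anyone curious, the setup is basically just Continue's config file under ~/.continue/ (config.json in older versions, config.yaml in newer ones) pointing the chat, autocomplete, and embedding roles at the Ollama server. A rough config.json sketch; the model names and the apiBase host are only placeholder examples, pick whatever fits your hardware:

  {
    "models": [
      {
        "title": "Qwen2.5 Coder 7B",
        "provider": "ollama",
        "model": "qwen2.5-coder:7b",
        "apiBase": "http://your-ollama-box:11434"
      }
    ],
    "tabAutocompleteModel": {
      "title": "Qwen2.5 Coder 1.5B",
      "provider": "ollama",
      "model": "qwen2.5-coder:1.5b",
      "apiBase": "http://your-ollama-box:11434"
    },
    "embeddingsProvider": {
      "provider": "ollama",
      "model": "nomic-embed-text",
      "apiBase": "http://your-ollama-box:11434"
    }
  }

The nice part is that the apiBase can point at a shared machine (like our Mac Studio), so nobody needs a beefy GPU in their own workstation.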



Which models are you using?



