Hacker News

I wonder if they have plans to allow the use of a locally hosted LLM?


There's an open-source IntelliJ plugin, https://github.com/continuedev/continue, that allows you to do this. It supports several different providers and models, e.g. LocalAI with Code Llama.
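Pointing the plugin at a local server usually comes down to a small config entry. As a rough sketch (the exact file location and key names are assumptions here; the project's docs are authoritative, and the schema has changed across versions), a `~/.continue/config.json` entry for a local OpenAI-compatible endpoint such as LocalAI might look like:

```json
{
  "models": [
    {
      "title": "Local Code Llama",
      "provider": "openai",
      "model": "codellama-7b",
      "apiBase": "http://localhost:8080/v1"
    }
  ]
}
```

The idea is just that anything speaking the OpenAI chat-completions API on localhost can be swapped in as the backend.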


Yesterday I started exploring CodeGPT. It lets you download and run a local model via llama.cpp, and it's been working fine for me so far, at least with the DeepSeek 6.7B model: https://plugins.jetbrains.com/plugin/21056-codegpt


This was the first thing I checked on the announcement page, but there's no mention of it :/ https://www.jetbrains.com/ai/



