Hacker News

Abstract the LLM calls into one layer, bundled with the prompt engineering for your use-case.

Et voilà, your application is now free from LLM lock-in.
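The comment above can be sketched as a thin abstraction layer. This is a minimal illustration, not a real SDK: the provider classes are stubs and the `summarize` prompt is a made-up example of the bundled prompt engineering.

```python
# Sketch of a single abstraction layer over LLM providers.
# Provider classes are stubs; a real one would call a vendor API.
from typing import Protocol


class LLMClient(Protocol):
    """Anything that can turn a prompt into a completion."""

    def complete(self, prompt: str) -> str: ...


class StubProviderA:
    def complete(self, prompt: str) -> str:
        return f"[provider-a] {prompt}"


class StubProviderB:
    def complete(self, prompt: str) -> str:
        return f"[provider-b] {prompt}"


# Application-facing layer: the prompt engineering lives here, so
# swapping providers never touches application code.
def summarize(client: LLMClient, text: str) -> str:
    prompt = f"Summarize in one sentence: {text}"
    return client.complete(prompt)


if __name__ == "__main__":
    print(summarize(StubProviderA(), "LLM lock-in discussion"))
    print(summarize(StubProviderB(), "LLM lock-in discussion"))
```

Swapping vendors then means writing one new client class that satisfies the `LLMClient` protocol, which is the whole point of the layer.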




Well, maybe at an API and prompt level. But if Google pulls ahead in this space, you may become dependent on what its models alone can do functionally. Even if you can trivially switch LLM and prompt, if the others can't do something equivalent (or at the same level of quality), you're still locked in. Until now we've basically had this situation with OpenAI.


Why is vendor lock-in a concern if no other vendor offers that functionality?



