Hacker News
Ask HN: How are you testing your LLM prompts?
5 points by luccasiau on April 2, 2023
How do you make sure changing your prompts will work well? How do you ensure it doesn't break anything that is already working?

Small changes to a prompt can lead to all kinds of unpredictable behavior. That's even more concerning as we build larger apps on something like LangChain, which requires the output to follow a rigid format.

My instinct would be to run a unit test suite on every prompt change. Is there an existing framework for this? If not, how are you testing your changes?
