If you generate a question using an LLM, what’s to stop me from answering it using an LLM? And who verifies if the answer is correct? An LLM?



Finally, full automation of the review process is here.


In the future, no one will know how to do anything anymore, and it'll be LLMs all the way down.



