Hacker News | apotheora's comments

This has strong implications: can the quality of the output ever really be trusted? Is this a symptom of models being inherently lazy?

Compounding is probably the breaking point: when one agent's output is another agent's input, does the garbage-in, garbage-out rule apply?
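The compounding concern can be illustrated with a toy model (a sketch, not a measurement: the per-agent quality number and the geometric-decay assumption are both hypothetical):

```python
# Hypothetical sketch: if each agent in a pipeline preserves only a
# fraction of its input's quality, quality decays geometrically with
# chain length -- the "garbage in, garbage out" compounding effect.

def chained_quality(per_agent_quality: float, num_agents: int) -> float:
    """Quality remaining after num_agents sequential handoffs,
    assuming each step multiplies quality by per_agent_quality."""
    quality = 1.0
    for _ in range(num_agents):
        quality *= per_agent_quality
    return quality

# Even a 95%-faithful agent degrades quickly when chained:
for n in (1, 5, 10, 20):
    print(n, round(chained_quality(0.95, n), 3))
```

Under these (assumed) numbers, ten handoffs leave roughly 60% of the original quality and twenty leave under 36%, which is why a long agent chain can fail even when every individual step looks acceptable.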

I would use this filter: not whether an idea is absurd, but whether someone commits time to build it.
