For something that's not rigorously defined, 99.999% and 100% are pretty frickin' close together in my book. Like, TherapistGPT isn't going to randomly say you should go kill yourself.

Unfortunately, I'm not sure what your point actually is. Is ChatGPT, in its current form, a replacement for human contact? Absolutely not. Do people have strong emotions about something that uses a GPU and a bunch of math, and was generated instead of being organically handcrafted by a human being, landing it in the uncanny valley? Totally. Is this box of matrices and dot products outputting something I personally find useful, despite its shortcomings? Yeah.

I agree that there's totally this brick-wall feeling when ChatGPT spins itself in circles because it ran out of context window or whatever.

At the end of the day, I think the yacht-rock cover of "Closer" is fun, even though it's AI-generated. However that makes you feel about my opinions.

https://youtu.be/ejubTfUhK9c



> Like, TherapistGPT isn't going to randomly say you should go kill yourself.

It won't literally do that; the labs are all careful about the obvious stuff.

But consider that Google Gemini's bad instructions nearly gave someone botulism*. There's a real chance of something like that happening in almost every field. I couldn't tell you what the equivalent failure would look like in therapy, for the same reason I wouldn't have known Gemini's recipe would lead to culturing botulism.

These models are certainly more capable than anything before them, but the Peter Principle still applies: we should treat them as no more than interns for now. That may be OK, and may even be an improvement over not having them, but it's easy to misjudge them.

* https://news.ycombinator.com/item?id=40724283



