Me too. But I learned that not everyone is like me. And in general, I also wouldn't trust an LLM that can't distinguish between formal talk and ghetto slang. It will likely get other things wrong as well. Humans will, too, so the error rate needs to be lower for me as a customer to be happier. I am not happy to get a fast but wrong response and then fight for days to get an actual human to sort out the mess.
I was assuming it was more of a text-to-speech error or a typo in the records, and it was supposed to say her first name there. Accidentally inserting super casual or offensive slang into a formal conversation doesn't feel like a mistake LLMs tend to make very readily.

I've grown up in various neighborhoods. In no context would calling someone a slur like that, when you don't even know them, be acceptable.

That said, it's obviously a technical glitch. Say it was something really important, like medication: would you rather wait two or three days to find out when it's arriving, or have a glitchy AI say some gibberish but then add that it's coming tomorrow?