> Of course not, but empathy in communication (not in action) is full of fluff. It almost requires it.

Empathy doesn't require fluff at all. (Think of all the short, poignant messages you've sent or received when someone is upset.)

The corporate need to not give ground on a complaint is where all that reassuring, repetitive, empty BS comes from.

> All "genuine" affirmation for the sake of empathy sounds cringe to me. I'd rather the doctor devoted their time to doing their job well instead of trying to make someone feel heard and validated - especially in an ER scenario where they are juggling critical patients. This tech can help with that.

Honestly, with "for the sake of empathy" it sounds like you really don't place a high value on empathy (which is not unusual, or necessarily wrong). But if that is the case, you're quite obviously not the right person to assess whether ChatGPT and the like can "help with that" in that context! :-)




>(Think of all the short, poignant messages you've sent or received when someone is upset.)

If it is short and not fluffy enough, it risks sounding dismissive. Those short messages generally pave the way for a deeper conversation about the subject.

>Honestly with "for the sake of empathy" it sounds like you really don't place a high value on empathy (which is not unusual, or necessarily wrong).

Not really. I think empathy is important in the right setting, but it is not the most important thing. Certainly not in the ER, where the doctor is overworked and has lives at stake. If they have the bandwidth, sure. If not, you can't blame them.

>But if that is the case you're quite obviously not the right person to assess whether ChatGPT and the like can "help with that" in that context! :-)

Disagree. I know how to sound empathetic for those that need it. Some people need words of affirmation and validation to be lifted. I am not one of them, but I understand. It is not that hard. Modern LLMs are more than capable of creating the prose for that and more. There is a time and place for everything, though. My empathy generally drives me to action and solving problems.


> I know how to sound empathetic for those that need it.

But that isn't actually empathy, and people can tell the difference.

> Some people need words of affirmation and validation to be lifted.

Not the words. The understanding and sharing that underpin the words.

My point is this: if you think empathy can be faked successfully, you simply aren't the right sort of person to decide whether the results of automated faking with an LLM are valuable to the listener.

Because people can very often tell when empathy is being faked. And when they discover that, you are not going to be easily forgiven.

Empathy implicitly involves exposing someone's feelings to the air, as it were, in order to show that you understand and share them. So faked empathy is variously experienced as insulting, patronising, hurtful, derisive, etc.

Using an LLM to create verbose fluffy fake empathy is going to stick out like a sore thumb.

If this isn't something you find easy to understand at the level of feeling, don't fake empathy, especially at volume. Stick to something very simple, plus an offer of contextually useful help.

> My empathy generally drives me to action and solving problems.

I think this is noble and valuable, and in your shoes I would stick to it. Offers of assistance are a kindness.

But you should never pretend to share someone's feelings if you don't share their feelings. Especially not at volume.



