Lying implies intent, and knowing what the truth is. Saying something you believe to be true, but which is wrong, is generally not considered a lie but a mistake.
I’ve come to the belief that making statements that may or may not be true, with reckless disregard for whether they actually are, is indeed lying.
Of course we know ChatGPT cannot lie like a human can, but a big reason the thing exists is to assemble text the same way humans do. So I think it’s useful rhetorically to say that ChatGPT, quite simply, lies.
But since this is about the law: ChatGPT can't lie, because there is no mens rea. And of course a reckless disregard for the law is a common failing of the common person (until it is too late, of course). Recklessness is also about intent; it is a form of wantonness, i.e. selfishness. This is why an insane person cannot be found guilty: you can't be reckless if you are incapable of discerning your impact on others.
It's not lying if ChatGPT is correct (which it often is), so repeating ChatGPT isn't lying in itself; the behaviour is negligent, or in the case of a lawyer grossly negligent, since a lawyer should know to check whether it is correct before repeating it.
As always, mens rea is a very important part of criminal law. Also, just because you don't like what someone says or writes doesn't mean it is a crime (even if it is factually incorrect).
>Lying implies intent, and knowing what the truth is. Saying something you believe to be true, but which is wrong, is generally not considered a lie but a mistake.
Those are the semantics of lying.
But "X like a duck" is about ignoring semantics, and focusing not on intent or any other subtletly, but only on the outward results (whether something has the external trappings of a duck).
So, if it produces things that look like lies, then it is lying.
A person who is mistaken looks like they're lying. That doesn't mean they're actually lying.
That's the thing people are trying to point out. You can't look at something that looks like it's lying and conclude that it's lying, because intent is an intrinsic part of what it means to lie.
2) "the camera cannot lie" - cameras have no intent?
I feel like I'm missing something from those definitions that you're trying to show me? I don't see how they support your implication that one can ignore intent when identifying a lie. (It would help if you cited the source you're using.)
>2) "the camera cannot lie" - cameras have no intent?
The point was that the dictionary definition accepts using the term "lie" of things that can misrepresent something (even when they're mere things and have no intent).
The dictionary's use of the common saying "the camera cannot lie" wasn't to argue that cameras don't lie because they don't have intent, but to show an example of the word "lie" used for things.
I can see how someone can be confused by this when discussing intent, however, since they opted for a negative example. But we absolutely do use the word for inanimate things that don't have intent too.
Either a) you knew it was false before posting, in which case yes, you are lying; or b) you knew there was a high possibility that ChatGPT could make things up, in which case you aren't lying per se but engaging in reckless behaviour. If your job relies on you posting to HN, or you know and accept that others rely on what you post to HN, then you are probably engaging in gross recklessness (like the lawyer in the article).
What ChatGPT does is better described by one definition of bullshit:
> bullshit is speech intended to persuade without regard for truth. The liar cares about the truth and attempts to hide it; the bullshitter doesn't care if what they say is true or false
-- Harry Frankfurt, On Bullshit, 2005
https://en.wikipedia.org/wiki/On_Bullshit
ChatGPT neither knows nor cares what the truth is. If it bullshits like a duck, it is a bullshitting duck.