AOC's Deepfake AI Porn Bill Unanimously Passes the Senate (rollingstone.com)
85 points by doener 44 days ago | 89 comments



The text of the House's version of the bill: https://www.congress.gov/bill/118th-congress/house-bill/7569...

* 10-year statute of limitations

* $150,000 limit on damages, plus court costs and attorney fees

The crime:

> The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual, regardless of whether the visual depiction indicates, through a label or some other form of information published with the visual depiction, that the visual depiction is not authentic ... [and]

> (I) the identifiable individual did not consent to such production, disclosure, solicitation, or possession;

> (II) the person knew or recklessly disregarded that the identifiable individual did not consent to such production, disclosure, solicitation, or possession; and

> (III) such production, disclosure, solicitation, or possession is in or affects interstate or foreign commerce or uses any means or facility of interstate or foreign commerce


So this bans Photoshop, too? Maybe even oil painting (oil paints are a technology, and an unauthorized nude oil painting of a celebrity would be indistinguishable from an "authentic"/authorized nude oil painting).


As I read it, this allows civil suits against a person who publishes a photo edited with "remove element from background", let alone any Instagram filter, without the explicit consent of the photographed person.


IANAL, not legal advice:

The definition of digital forgery references the existing legal code's definition of "Intimate visual depiction", which is defined here: http://uscode.house.gov/view.xhtml?req=(title:15%20section:6...

So if you take an otherwise legal picture of someone, and add a filter which, e.g., gives them a big forehead, you're fine. However, if you take a picture of someone and use it to create realistic porn of that person, they can sue.


This discussion section has convinced me that high school civics courses need to teach how to read bills. We’ve got (lots of!) people assuming that legislators’ statements of their own motivations represent the law itself; that if something’s not in the text of a bill that explicitly modifies another law then it must just not exist (… try the law it’s modifying) and tons of other mistakes; and in general most of the discussion being about as silly as one might expect if the text of the law were secret rather than readily available and not even that hard to read.

Plus all the usual not understanding how the government, and especially the first amendment, works, which is a separate but also serious problem.


I don't think it would do that, unless the original was an "intimate visual depiction."

However, I do think it would allow, say, Trump to sue people who made or distributed an image of him appearing in a diaper (IIRC that has been a fairly common way to mock him as a crybaby).


IANAL, not legal advice:

The existing legal code which this bill modified is linked from the bill and available here: http://uscode.house.gov/view.xhtml?req=(title:15%20section:6...

In this, it defines "intimate visual depiction". I see nothing here under which an image of Trump in a diaper would qualify.


So, people should read the bill.

Roughly, my take is: It does feel true that this is hasty and responsive; there's an obvious "demand" for this kind of legislation.

That being said; I don't find the bill itself to be particularly problematic. The actual changes to the law are kind of minimal. "Digital forgery" as they use it, sure, could apply to Photoshops as well.

Big picture here, I have no qualms with this, even if it is more "show" than "substance."


>"Digital forgery" as they use it, sure, could apply to Photoshops as well.

It probably should. I don't think the technology matters; what is created with it does.


Yeah, the relevance of the technology is only in whether it's a big enough problem to be worth legislating. It would be weird for it to be legal to make fake porn of someone only if you put some real effort into it rather than doing it the easy way.


Yeah, Photoshopping people's faces into porn and posting it on the Internet doesn't strike me as much different than using AI to do the same, except that it might be more obviously fake.


Yeah. We used to call it "morphing" when we were teenagers. We created a bunch ourselves of movie actors and shared them among ourselves as fodder for masturbation.


Not a lawyer; agree it seems pretty reasonable and not-overbroad.

IMO the most subjective aspect is what constitutes likeness. I presume that's left to the judge's/jury's discretion? The range of face-recognition ability is pretty broad.


If you read the actual law, it looks like this law pertains to "digital forgery" - computer-generated material that "falsely appears to be authentic". Does this mean that AI-generated porn is OK as long as it is explicitly labelled?

South Carolina is currently in a funny state where it's illegal to distribute non-consensual AI-generated porn of someone, but there aren't any laws against non-consensual revenge porn. The good news is that revenge porn laws have been adopted almost country-wide.


That is explicitly addressed in the bill text:

> regardless of whether the visual depiction indicates, through a label or some other form of information published with the visual depiction, that the visual depiction is not authentic


I wonder if CivitAI will stop hosting LoRAs of real people out of fear of liability now. I don't know if they explicitly condone deepfake porn, but they do host NSFW baseline models and LoRAs of real people's faces under the same roof, so it's not hard to put two and two together.


You can make these without LoRAs too (e.g. Elon and AOC and many other celebrities exist in the base models without extra steps). I wonder how this affects open source models in general.
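
For what it's worth, the technical barrier here is close to nil. Here's a minimal sketch using the Hugging Face diffusers library (the model ID and LoRA file name below are hypothetical placeholders, not anything CivitAI actually hosts) showing that combining a base checkpoint with any downloaded LoRA is a couple of lines, which is why hosting the two side by side invites the "two and two" reading:

    # Sketch only; the model ID and LoRA file name are placeholders.
    # Assumes a CUDA-capable GPU is available.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "some-org/some-base-checkpoint",  # hypothetical base model
        torch_dtype=torch.float16,
    ).to("cuda")

    # Any LoRA pulled from a model hub drops straight into the pipeline:
    pipe.load_lora_weights("./loras", weight_name="downloaded_lora.safetensors")

    pipe("a portrait photo").images[0].save("out.png")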


If they do, they'll just open the floor to a foreign company that doesn't care about following US law.

I don't think this changes much on the tool side, especially because most of those are already open source and can be hosted anywhere, but it goes after the people who order these images made, who are usually Americans and hence subject to US jurisdiction.


Anyone know of any decent SFW alternatives to CivitAI?


Every time I go on that website and see the “age slider” LoRA, I wonder how the fuck that website continues to exist.

There is so much icky shit in text-to-image and I’m frankly amazed that the employees of Stability AI have not gotten in trouble yet for the fact that they used CSAM during training of Stable Diffusion 1.5.

https://www.theverge.com/2023/12/20/24009418/generative-ai-i...


And yes, during training, you have to at some point have its bytes on the training machine (not just links). By that definition, the folks who authorized training of their model with this dataset technically committed a big scary crime.


I'm guessing you mean a child-porn related crime here, which -- maybe -- but we're still in the process of determining whether the other crime/tort applies, which would be copyright infringement.



"to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual"

So if a deepfake were generated with a distinctive indication that it is an AI-generated forgery, would it be OK? For example, a cyborg hand or palm, with everything else like the real person.


It seems fairly clear to me that if the depiction is obviously not real, that's OK - for example if it looks just like [celeb] but they have blue skin or 10 eyes or 6 legs, or it's a cartoon, etc.

But most people producing fake celeb porn are making images that, at first glance, look exactly like [celeb], and only on careful inspection do you see AI-telltales like wrong numbers of fingers. Easy to miss. The law is saying that these realistic images are not OK and the person depicted in them can sue, no matter how many disclaimers you add to the picture. It applies to well-made photoshops created by humans just as much as AI slop.


No, this is addressed in the bill text

> regardless of whether the visual depiction indicates, through a label or some other form of information published with the visual depiction, that the visual depiction is not authentic


A visual depiction of the individual can include a cyborg hand, so IMO no: even if there are obvious clues that a person is deepfaked, it still applies.


I guess they're applying the same test you would expect with libel: would the layman expect the speech to be true or intentionally misrepresentative, i.e. would the image be construed as real.

Otherwise, parody and criticism using AI would be limited.


I think nothing less than a watermark or caption will suffice, but this is probably a problem that will have to be answered in courts. I can't believe they would pass a law with such ambiguity. What do these people do all day?
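
To make the watermark idea concrete: burning a visible caption into an image is trivial, e.g. with Pillow (a sketch only; the file names are hypothetical). The open question is whether a court would treat even that as sufficient, given the bill's "regardless of whether the visual depiction indicates, through a label" language:

    # Sketch only; file names are hypothetical.
    from PIL import Image, ImageDraw

    img = Image.open("generated.png")
    draw = ImageDraw.Draw(img)
    draw.text((10, 10), "AI GENERATED - NOT A REAL PHOTO", fill="red")
    img.save("generated_labeled.png")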


> The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake...

How can someone control if they receive a deepfake?


I don't think the bill says anything about receiving, but it says "solicitation or possession." So if you receive such an image against your will, delete it. Same as CP, I guess.


I don't think anyone attempted to apply logic to this law before passage.

It's more about the feeling they get from its passage, not any actual effects.


Hey, so not trying to troll... I feel like that has to be stated when talking about this stuff.

I get that weaponizing revenge porn someone sent you can be damaging.

But I don't get what this law is trying to do.

Is Hustler v. Falwell still valid?

> In the case, Hustler magazine ran a full-page parody ad against televangelist and political commentator Jerry Falwell Sr., depicting him as an incestuous drunk who had sex with his mother in an outhouse.

https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell

As long as it's labeled "AI Generated" in the corner, does that make it OK?

What if a person wanted to generate porn of Eva Braun & Adolf Hitler -- who knows, just for the shock value... would that be banned? Even if it's a snuff film?

I can think of a bunch of things that aren't full on traditional porn... can I still make a video of Katy Perry eating a banana? Can I show Hugh Jackman making out with John C. McGinley (as Dr. Cox, of course)? Or what if I wanted a video of Elizabeth Hurley doing ASMR in a bikini, or in a burqa?

Falls back to, "What's porn, anyway?" Is it still "I know it when I see it"? / "Anything that gives a judge an erection"? Or is it anything involving using AI to draw people? I make a drawing on Civitai... can someone come up and say, "That looks like my great aunt Trudy, take it down now!"

We've had Photoshop forever... I get that AI is maybe (emphasis on maybe) a bit more realistic than Photoshop. But to me, a law like this feels like it'll have unintended and far-reaching consequences for how we use AI to create scenes from each of our own vivid and unique imaginations.


My answer to pretty much everything these days is "it's all marketing." The lawmakers need to be seen as though they are doing something about what they perceive as a hot topic these days.

However, most laws have some vagueness about them, and, as intended, I believe it will fall to the courts to interpret the law.


You are correct, this new law is absolute nonsense and shows a complete misunderstanding of the technology.

All your examples would now indeed be "illegal" to produce except the one involving only Hugh Jackman and John C. McGinley kissing, as this law is ONLY about "protecting women" and "women's bodily autonomy". AOC seems to think men don't have bodies, apparently?


Just read the actual law... it says "identifiable individual" in the language -- so men and women (and anyone else!). It also limits the damage-seeking to "intimate" material, but doesn't really explain what intimate means. It also says you can't make the visual depiction even if you put "This is an AI fake" on it.

> DIGITAL FORGERY.—The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual, regardless of whether the visual depiction indicates, through a label or some other form of information published with the visual depiction, that the visual depiction is not authentic.

https://acrobat.adobe.com/id/urn:aaid:sc:us:83d22542-02aa-46...

A law like this would have 100% clashed with the ruling of Hustler Magazine, Inc. v. Falwell (1988).

I don't know who else remembers the 80s... but Tipper Gore and Nancy Reagan were the bad guys. Censorship, no matter who it is supposed to help, will be twisted by those looking to silence their opponents.

Dee Snider's speech before Congress is as relevant now as it was then. "There is no authority who has the right, and necessary insight to make these judgements..."

https://www.youtube.com/watch?v=S0Vyr1TylTE


> doesn't really explain what intimate means

It doesn't have to. The act exclusively amends 15 USC 6851, which already defines it in §6851(a)(5): http://uscode.house.gov/view.xhtml?req=(title:15%20section:6...

    (5) Intimate visual depiction
    The term "intimate visual depiction"-
    (A) means a visual depiction, as that term is defined in section 2256(5) of title 18, that depicts-
        (i) the uncovered genitals, pubic area, anus, or post-pubescent female nipple of an identifiable individual; or
        (ii) the display or transfer of bodily sexual fluids-
            (I) on to any part of the body of an identifiable individual;
            (II) from the body of an identifiable individual; or
        (iii) an identifiable individual engaging in sexually explicit conduct; and
    (B) includes any visual depictions described in subparagraph (A) produced while the identifiable individual was in a public place only if the individual did not-
        (i) voluntarily display the content depicted; or
        (ii) consent to the sexual conduct depicted.

> A law like this would have 100% clashed with the ruling of Hustler Magazine, Inc. v. Falwell (1988)

It would not. The parody advert at the heart of Hustler vs. Falwell did not include an "intimate visual depiction" of Falwell. It had a stock portrait of him, and lurid satirical text purporting to be his own words: https://upload.wikimedia.org/wikipedia/en/5/5d/Falwellhustle...


The actual text of the bill doesn’t mention gender. It affects fake depictions of men, too.


I'm glad the Violence Against Women Act "doesn't mention gender", thank you for letting me know.


Can you please stop posting ideological battle comments to HN? You've been doing it a lot again lately, and we have to ban such accounts.

I don't want to ban you again but if this keeps up we're going to have to.

https://news.ycombinator.com/newsguidelines.html


The law is about "digital forgeries" in general - trying to pass off any computer-generated fake as authentic - but they're marketing this as just the "deepfake porn law" for voter popularity.


The act's text: https://www.congress.gov/bill/118th-congress/senate-bill/369...

It modifies US Code section 6851: Civil action relating to disclosure of intimate images http://uscode.house.gov/view.xhtml?req=(title:15%20section:6...

The existing law allows a person with standing to bring a civil action against distributors of commercial pornography. This act amends that to cover not just the commercial pornographer having actual pictures/video of the person, but also the case where they generate realistic pictures/video of the person.

Hitler and Braun are both dead, so they lack standing.

Falwell's case did not include an "intimate visual depiction", so he also couldn't bring a civil action under this section of the code. The requirement of "intimate visual depiction" would also let you get away with Katy Perry's fruit consumption, and provided there are no genitals, anuses, sexual fluids, etc., you could depict Jackman and McGinley... You could also be safe and render it as a cartoon, as the depiction has to be "indistinguishable from an authentic visual depiction of the individual".


Good. I’m not a lawyer but “knowingly” seems to be a central point here. Does an AI image generator “know” it’s being asked to make a nude of Jane Smith? The person asking it certainly does. Does their “artist for hire”?

If this makes it all but illegal for AIs to generate images of people in the general case (e.g. outside a Hollywood studio using it for their own actors per contract), fine. So be it. Nothing of value would be lost.

“Oh no, I can’t add an uncanny valley picture to my blog” is a small price to pay for “I can’t get deepfakes of Taylor Swift anymore”.

I’m usually far on the other side of things like that, like Congress’s endless requests to make cell phones detect nudity in text messages. I guess it’s mainly that I don’t see value in AI generated pictures of humans. Having that capability hasn’t improved anything that I can see.


It's much more narrow than that, if you read it.

It appears to be "knowingly make the naughty picture and also know that it's directly tied to a person and that it's naughty"


Minus the naughty bits. The text doesn’t distinguish what the person’s doing.

I suspect the courts will find that “make a picture of Jane Smith eating dinner” should have been caught: someone asked for it by name. If you provided a sufficiently detailed description of Jane, like you would to a police artist, and the end result ends up looking like Jane although she was never mentioned, that would seem a lot trickier to pin on the AI.


There does not appear to be a "naughty" provision only an "intimate" one. What qualifies as "intimate"? Seems dangerously broad.


The bill is here: https://www.congress.gov/bill/118th-congress/house-bill/7569...

Which links to the legal code which is modified here: http://uscode.house.gov/view.xhtml?req=(title:15%20section:6...

The original legal code defines "intimate visual depiction".


Do you ask this question about whether a gun is able to "know" that it's shooting someone?


No, nor of a paint brush. Those are dumb tools that do nothing without direct human action.

I can’t tell a gun “shoot John Doe”. If I could, I’d likely be OK with a law holding the gun maker responsible for it. “You shouldn’t make guns that can target people by name” would be the analogous conversation to “you shouldn’t make machines that can create fake images of humans by name”.


It'll probably move to the lawless wild west of Eastern Europe.


> The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography

Produce? Why should it be illegal to create anything on my own machine? Where is the harm? Or is this moral prudishness at a new level of invasiveness? Is it now illegal to imagine someone naked?

Distribute? Yeah, this is a dick move.

Receive? So am I in trouble if someone else sends me this? Is it now illegal to look up those Swift fakes? Why?

It seems more capable AI is being used as an excuse to limit our freedoms even more, instead of the other way around. Is there any system of law based on reason instead of votes?


What about other deepfakes?


A good question. I can think of at least one valid use for a deepfake: imagine a deceased father, and his daughter is getting married. You could deepfake a father-daughter dance at the reception (with the daughter's permission).

But that's one use, and I can think of very few other legitimate uses, and many, many, many uses that I think are illegitimate.

A cynical view would be that they'll get around to legislation that covers other deepfakes once too many politicians get their reputations ruined by deepfakes of them doing things that they didn't do (or did but nobody has real video to prove it).


Thinking this through as I write it; please forgive if it’s a little disjoint.

I could imagine a scenario where AI companies don't license their API to just anyone with a credit card, but only to someone willing to sign a contract with indemnification. Maybe then the daughter wouldn’t go directly to some AI website, but might hire a local or online human artist who had such a contract. Maybe she wouldn’t look for “artists who can do AI stuff” but “artists who can add my dad to my picture”. The art is what she’s paying for, not the specific technology used to create it. And the human artist would think “well, I obviously have the daughter’s permission. I can’t ask the dad for his, but that’s a normal thing for a dad to do, and it seems unlikely he’d mind” and accept the commission. Maybe it turns out there are dads who don’t want to be involved, like they disowned their crackhead kid or something. That risk would be part of the human judgment going into the project. Today the woman could ask an AI “can you add a picture of me dancing with my dad, King Charles?” A human artist would say “that ain’t your dad” and reject it.

Maybe all but requiring a human in the loop wouldn’t be a bad idea.


It seems she only addressed the issue as it impacted her, by amending the Violence Against Women Act and targeting it at porn.

More general deepfakes are still a big issue in politics and business, as well as other areas I'm sure.

I get this would be a personal issue to her, after AI images of her made the rounds, but I would have hoped congress would address this more broadly, as it still impacts them all when it comes to election misinformation. Maybe they figure if they ban it they can't take advantage of it when it works in their favor?


This is great. Although I don't quite get the connection to abortion.


The argument is that both abortion and deep fakes are about bodily autonomy. Autonomy to choose to carry a foetus to term and the autonomy for one's corporeal image to not be depicted in carnal acts they didn't perform.

EDIT: corporal to corporeal


A body and an image of a body are two totally separate things.


But the human element is quite related.

Consider the following:

A) One of your co-workers touches you inappropriately without your consent.

B) One of your co-workers distributes pornographic images of you to your peers without your consent.

It's easy to say that "touching" and "distributing images" are not "the same thing", and also both scenarios are blatant sexual harassment.


Not sure if you're just not understanding what the comment you replied to is saying or if you're just responding in bad faith.

The comment doesn't say the two are the same. It says that both should fall under your consent because they are either your actual body or a depiction of it...

Not sure how that can be any clearer.


[flagged]


No, I think you're the one who is insane for misrepresenting her position like this.

AOC derangement syndrome, I think it's called.

I suggest you read the article again and really try to understand what is being proposed, and not be blinded by your very obvious bias against her.


Please don't cross into personal attack, no matter how mistaken someone is or you feel they are. It's not what this site is for, and destroys what it is for, and we have to ban accounts that do it.

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful.

Edit: it looks like you've been using HN comments primarily for political or ideological battle. That's not allowed here, regardless of your political views, and we have to ban accounts that do this as well, so it would be good if you'd please stop doing that. (For more explanation, see https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme....)


I like AOC, that doesn't mean I am brain dead and support everything she does and says without thinking.

The world has a place for zealots and I'm sure either political party would be happy to have you.


Personal attacks are definitely going to get you banned again on HN so please don't post anything more of the sort.

See also https://news.ycombinator.com/item?id=41072117.


Sure seems like you know a lot about AOC and her feet.


A couple years ago I was playing around with both AI image enhancement (way before DALL-E) and also having fun trolling Ben Shapiro about his obsession with AOC.

I went down the fun rabbit hole of learning AI image generation and more statistics while trolling Ben Shapiro with high-quality foot pics for his viewing pleasure. All told it was a fun learning experience, and yes, at the time I had seen probably every single picture of AOC's feet for the training portion. Pretty sure I put the project on GitHub under my pseudonym.


> Autonomy to choose to carry a foetus to term

No, I think it's the opposite in this context. It's the autonomy to terminate the fetus and not carry it to term.


I worded it unclearly, I meant that a person should be able to choose to carry a foetus to term rather than be obligated to.


*corporeal


Corporal means relating to the body.



[flagged]


> “(A) IN GENERAL.—The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.

For reference, the definition from the bill.

https://www.congress.gov/bill/118th-congress/senate-bill/369...

For those not familiar with reading US congressional legislation, there’s usually a short statement of purpose, a bunch of stuff about what Congress is thinking (or claiming to think) about the topic and their motivation for the law, then you get some definitions that will matter for the law itself (the explanatory preamble may contain things that look like definitions, but they matter less than the ones in the law proper), then the meat of the law.


Not a problem: the supreme court will likely rule in your favor.


When does this type of censorship (yes, that's what it is, though maybe a positive kind) run into the first amendment? If I make a deepfake of Trump making love to Lindsey Graham, when does it become protected speech? Do I need a storyline? Does there need to be a moral bent? Does Trump need to say something about Graham "crossing his border?"


By my estimation this runs head first into the first amendment and likely does not survive. It also likely violates the equal protection clause, as it is only granting "identifiable individuals" (which does not appear to be defined?) special rights.

There is a mountain of case law that states that public figures have limited protections from criticism, mockery, etc. I am not sure humiliation will hold up any better.

They likely knew this as they add this at the end:

> If any provision of this Act, an amendment made by this Act, or the application of such a provision or amendment to any person or circumstance, is held to be unconstitutional, the remaining provisions of and amendments made by this Act, and the application of the provision or amendment held to be unconstitutional to any other person or circumstance, shall not be affected thereby.

Also this definition is still quite broad?

> “(3) DIGITAL FORGERY.—The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual, regardless of whether the visual depiction indicates, through a label or some other form of information published with the visual depiction, that the visual depiction is not authentic.”;

So it doesn't even need to be pornographic, just "intimate".


Not a lawyer, but my understanding is that (broadly speaking) fraud and other malicious deception are not considered to be protected by the first amendment. There are some complications re. your example in that political speech has more protections. https://www.freedomforum.org/is-lying-protected-first-amendm...


It seems this only applies to "women's bodies", so you're fine making Trump porn all day.


Where’d you read that in the law?


Whether or not it also applies to men, it was written with women in mind.

"when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away"

"Current laws don’t apply to deepfakes, leaving women and girls who suffer from this image-based sexual abuse without a legal remedy"

"The legislation would amend the Violence Against Women Act (VAWA) "


Don't let the pesky first amendment get in the way of an emotionally charged law. This is literally how all rights get taken away. You find an issue that is emotionally charged, then say some "new doohickey" causes it to now be different than it ever was in the past (fear mongering), talk about how damaging this could possibly be and why we must take extreme measures by limiting freedoms (cast uncertainty), and finally indicate that society will crumble due to this (doubt).

Same playbook regardless of issue. People in power, stirred up by busybodies, have been doing this repeatedly for 100 years. I do find it odd that originally these kinds of things were pushed by religious fundamentalists. Many laws were passed based upon religious fundamentalist ideas that restrict behaviors and restrict speech. The entire liberal push used to be to strip down these restrictions because they fundamentally violated some of the core freedoms we're granted in a free country.

Now it seems the script has flipped. I see more liberals pushing for restrictions of Rights and freedoms than I do religious conservatives. No good deed goes unpunished and no emotionally feel good law will remain unabused by those in power.

So while it's easy to agree with the core idea that deepfake porn is bad, this law conveys a new right that has never been conveyed before: the idea that you have ownership of your image and how it is used. Depending on how you twist the words, this could crush satire completely. It can crush freedom of artistic expression. It's ripe for abuse, and because it's so emotionally charged you get a unanimous vote: politicians hoping to gain brownie points while understanding the court will largely gut this bill. Unfortunately the people who have their rights violated (the actual enumerated and protected rights, not the emotional feel-good things people think are rights) have no recourse other than spending large amounts of money or hoping the ACLU will take up the fight.


When the first amendment was drafted 233 years ago do you think they anticipated that one day anyone would be able to speak in anyone else's voice, indistinguishably from if they had actually said it? The constitution does occasionally need re-evaluating in light of new technology, like when the right to bear arms was effectively narrowed to the right to bear some arms when the most powerful weapons vastly outstripped what the general public could reasonably be expected to handle. Few people will unironically argue that 2A means that everyone should be allowed to have a nuke.


Do we outlaw highly skilled impressionists? Pretty sure those existed 233 years ago.


Highly skilled impressionists portraying themselves as the real thing? Yes. That’s fraud.


But if we make it clear the impressionist is doing an impression of a famous person we're good? Because this law disregards that, you can clearly label something as "fake" and it doesn't matter, you're still liable as long as it appears authentic. A good impressionist would appear authentic.


>I see more liberals pushing for restrictions of Rights and freedoms than I do religious conservatives

This doesn't appear to be true. This is a bipartisan bill, same as KOSA (which most liberals also do not want).

For most topics that upset liberals (e.g. racists on Twitter), they're not asking for laws, they're asking for private companies to ban them.

Meanwhile, religious conservatives are actually banning books from libraries, and getting teachers fired.


> Depending on how you twist the words this could crush a satire completely. This can crush freedom of artistic expression. This is ripe for abuse because it's so emotionally charged is why you get a unanimous but vote.

Who's fear-mongering now?

If you read the bill you'll see that to be liable 1) it must depict an intimate scene, 2) it must be done knowingly, 3) it must be indistinguishable from a real person and 4) it must be done without their consent.

That's a pretty reasonable standard IMO. This certainly doesn't include political speech or satire, unless your satire must include photorealistic porn.


5 years ago, I couldn’t say “hey computer, make me a video of Jane Smith having sex with 3 guys, and also John Doe pointing a rifle at a baby”. I missed the part of the 1st amendment that gives me an inalienable right to make realistic images without someone’s permission of them doing things they wouldn’t do.


Wouldn't a more effective way be to hold any maker of a porn deep fake financially liable for lost revenue?

Just like if I made a movie with a famous actor's likeness using AI.


If I made a fake movie of an unattractive person I knew, and they wouldn’t lose any revenue by it existing, shouldn’t they have the right to make me stop anyway?



