Hacker News
A Call to Legislate 'Backdoors' into Stable Diffusion (metaphysic.ai)
25 points by Hard_Space on Feb 15, 2023 | hide | past | favorite | 19 comments


If you outlaw good quality stable diffusion, then only outlaws will have good quality stable diffusion.


very true


This reads like something out of a dystopian nightmare. And it's something that is very, very easily circumvented: generate an image similar to what you want to achieve, then photoshop the faces of the people into the generated image. Or do the authors propose that photo editing software also include backdoors to prevent this?



That's hardly the same; it's a very specific use case, and not all photo editors include the feature.


Need a brain backdoor to prevent people from fantasizing about celebrities. That'll stop it for sure.


That would be absurd! You'll merely have to purchase a crypto token that proves that you have a license to create a mental image of the copyrighted subject, which is after all the ultimate intellectual property.


So the reasoning seems to be: 1) we can create such tools; 2) therefore we must create such tools; 3) they can be used to harm others; 4) therefore we must change the rest of society and its laws to adapt to them, alter publicly available images, and impose restrictions on (preexisting) non-harmful actions.

Beyond bizarre.


How is this supposed to be implemented exactly? Fundamentally, models can always be trained further. Even if somehow you find a way to make it impossible to retrain the removed capabilities, new models can be trained. Should models be proprietary and cloud hosted by law, then? What would the democratic consequences of that be, in the long term?

From a completely different perspective, should simply generating certain classes of images be considered a crime? Publishing or sharing them is already covered by existing laws. But legislating any further than that would surely be at odds with several human rights.

None of this adds up to me. Not technically, not legally and not ethically either.


Lol. You can't outlaw matrix multiplication.


Isn't this a similar argument to "you can't outlaw metalsmithing" if there was a ban on carrying knives?


There is such a ban in England (4 years in prison); see how it's working out for them: https://commonslibrary.parliament.uk/research-briefings/sn04...


This is written by some researcher with big dreams for the technique they have just developed...

But the chance of their technique being mandated by law worldwide into a fledgling field of technology seems pretty much nil.


Yes. YES. It's happening. The cyberpunk future we've been waiting for. Corpos getting netwatch to hunt rogue AI. Gonna have my USB thumbdrive with illegal AI tech to run for chooms on the dark net.

Just need cyberware to happen and then if the future is gonna suck it can at least look interesting.


Still no flying cars though. :(


At least now we have 280-character tweets instead of 140!


Ignoring the policy side of things for now, as I'm sure we'll get plenty of discussion in other comments...

On a technical level how well do these adversarial attacks survive trivial transformations on the original pixel data? I would imagine even a simple JPEG compression pass would ruin the effect.
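A quick way to check this yourself (a hypothetical sketch using Pillow and NumPy, not from the article; the image and perturbation here are synthetic stand-ins): add a small pixel-level perturbation, round-trip the result through JPEG, and measure how far compression moves the pixels.

```python
# Hypothetical sketch: measure how much a small adversarial-style
# perturbation survives a JPEG re-encode (Pillow + NumPy assumed).
import io

import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
base = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)   # stand-in "photo"
delta = rng.integers(-4, 5, base.shape).astype(np.int16)   # small perturbation
perturbed = np.clip(base.astype(np.int16) + delta, 0, 255).astype(np.uint8)

# Round-trip the perturbed image through JPEG at quality 85.
buf = io.BytesIO()
Image.fromarray(perturbed).save(buf, format="JPEG", quality=85)
recovered = np.asarray(Image.open(io.BytesIO(buf.getvalue())))

# How far did compression push the pixels away from the perturbed image?
mean_err = np.abs(recovered.astype(np.int16) - perturbed.astype(np.int16)).mean()
print(f"mean per-pixel error after JPEG: {mean_err:.2f}")
```

If the mean error is on the same order as the perturbation itself, the adversarial signal is largely destroyed; a real test would of course use the actual attacked image and check whether the attack still fires on the model.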


It was about time someone created an ethical framework for text-to-image systems.


Sure, an ethical framework will be created.

And unethical actors will ignore it.

This will only affect those who are already interested in self-censoring. Tech unburdened by these laws will emerge regardless.

It's also illegal to hack and exploit systems. Never stopped happening.



