Hacker News

> For example, if next month you developed a model that could produce extremely high quality video clips from text and reference images, you did a small, gated beta release with no PR, and one of your beta testers immediately uses it to make e.g. highly realistic revenge porn.

You make a great point here. This is why we need as much open source and as much wide adoption as possible. Wide adoption = public education in the most effective way.

The reason we are having this discussion at all is precisely because OpenAI, Stability.ai, FAIR/Llama, and Midjourney have had their products widely adopted and their capabilities have shocked and educated the whole world, technologists and laymen alike.

The benefit of adoption is education. The world is already adapting.

Doing anything that limits adoption or encourages the underground development of AI tech is a mistake. Regulating it in this way will push it underground and make it harder to track and harder for the public to understand and prepare for.




I think the stance that regulation slows innovation and adoption, and that unregulated adoption yields public understanding, is exceedingly naive, especially for technically sophisticated products.

Imagine if, say, drug testing and manufacturing were subject to no regulation. As a consumer, you might be aware that some chemicals are very powerful and useful, but you couldn't be sure that any specific product contains the chemicals it claims to, that it was produced in a way that ensures a consistent product, that it was tested for safety, or what the evidence is that it's effective against a particular condition. Even if drugs from a range of producers were widely adopted, would the public really understand what they're taking, and whether it's safe? Should the burden be on them to vet every medication on the market? Or is it appropriate to have some regulation ensuring that medications contain their active ingredients in the amounts stated, are produced with high quality assurance, and are actually shown to be effective? Oh, no, says a pharma industry PR person. "Doing anything that limits the adoption or encourages the underground development of bioactive chemicals is a mistake. Regulating it in this way will push it underground and make it harder to track and harder for the public to understand and prepare for."

If a team of PhDs can spend weeks trying to explain "why did the model do Y in response to X?" or figure out "can we stop it from doing Z?", then expecting "wide adoption" to force enough "public education" to defuse all harms, such that no regulation whatsoever is necessary, is ... beyond optimistic.


Regulation does slow innovation, but it is often needed because those innovating will not account for externalities. This is why we have the Clean Air and Clean Water Acts.

The debate is really about how much and what type of regulation. It is of strategic importance that we do not let bad actors get the upper hand, but we also know that bad actors will rarely follow any of this regulation anyway. There is something to be said for regulating the application rather than the technology, as well as for realizing that large corporations have historically used regulatory capture to increase their moat.

Given it seems quite unlikely we will be able to stop prompt injections, what are we to do?

Provenance seems like a good option, but difficult to implement. It allows us to track who created what, so when someone does something bad, we can find and punish them.
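To make "track who created what" concrete, here is one minimal sketch of content provenance via a creator tag. Everything here is illustrative: the key name is hypothetical, and an HMAC is used as a stand-in for a real digital signature, since an actual scheme (e.g. C2PA-style manifests) would use asymmetric keys and certificate chains rather than a shared secret.

```python
import hashlib
import hmac

# Hypothetical per-creator signing key. A real provenance system would
# use a private key the creator holds, with the public half published.
CREATOR_KEY = b"example-creator-secret"

def sign_content(content: bytes, key: bytes = CREATOR_KEY) -> str:
    """Tag content by MACing its SHA-256 hash with the creator's key."""
    digest = hashlib.sha256(content).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_content(content: bytes, tag: str, key: bytes = CREATOR_KEY) -> bool:
    """Check that the tag matches the content under the given key."""
    return hmac.compare_digest(sign_content(content, key), tag)

clip = b"...generated media bytes..."
tag = sign_content(clip)
assert verify_content(clip, tag)             # authentic content verifies
assert not verify_content(clip + b"x", tag)  # any tampering breaks the tag
```

The hard part, as the parent says, isn't the cryptography; it's getting every generator to attach tags and keeping tags attached as content is re-encoded and shared.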

There are analogies to be made with the Bill of Rights and gun laws. The gun analogy seems interesting because guns generally have to be registered, yet criminals often won't register them, and the debate is quite polarized.


With the pharma example, what if we as a society circumvented the issue by not having closed source medicine? If the means to produce aspirin, including ingredients, methodology, QA, etc, were publicly available, what would that look like?

I met some biohackers at DEF CON who took this perspective, a sort of "open source, but for medicine" ideology. I see the dangers of a massively uneducated population poisoning themselves trying to 3D-print aspirin, but they already do that with horse paste, so I'm not sure it's a new issue.


My argument isn't that regulation in general is bad. I'm an advocate of greater regulation in medicine, drugs in particular. But the cost of public exposure to potentially dangerous unregulated drugs is quite different from the cost of building a restrictive system around the development and deployment of AI.

AI is a very different problem space. Even big models fit easily on a microSD card. You could carry the weights and supporting code for a GPT4-class model on a thumb drive, or transfer them wirelessly in minutes. From a practical enforcement perspective, that makes regulating development very different from regulating drugs, conventional weapons, or most other things.
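Back-of-envelope numbers make the point. The parameter count, quantization, and link speed below are illustrative assumptions, not published figures (GPT4's actual size has never been disclosed):

```python
# Back-of-envelope: how long to move model weights over a fast link.

def transfer_seconds(params_billions: float, bytes_per_param: int,
                     gbps: float) -> float:
    """Time to transfer a model of the given size at the given link speed."""
    size_bytes = params_billions * 1e9 * bytes_per_param
    return size_bytes * 8 / (gbps * 1e9)  # bits over bits-per-second

# Assumed: a 70B-parameter model quantized to 1 byte/parameter,
# over a ~1 Gbps effective wireless link.
t = transfer_seconds(70, 1, 1.0)
print(f"{t / 60:.0f} minutes")  # prints "9 minutes"
```

Even with conservative assumptions the transfer stays in the range of minutes, which is exactly why enforcing developmental restrictions on weights is so hard.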

Also consider that criminals and other bad actors don't care about laws. The RIAA and MPAA have tried hard for 20+ years to stop piracy, and the DMCA and other laws were built to support that effort, yet anyone reading this can easily download the latest blockbuster, even while it's still in theaters.

Even still, I'm not saying don't make laws or regulations on AI. I'm just saying we need to carefully consider what we're really trying to protect or prevent.

Also, I certainly believe that in this case, the widespread public adoption of AI tech has already driven education and adaptation that could not have been achieved otherwise. My mom understands that those pictures of Trump being chased by the cops are fake. Why? Because Stable Diffusion is on my home computer so I can make them too. I think this needs to continue.



