OpenAI regulatory filing pushing government to ban illegal advanced matrix operations [pdf] (regulations.gov)
26 points by udev4096 on June 17, 2023 | 13 comments



Regulation of the kind OpenAI appears to be campaigning for in this response is, in my eyes, going to follow a trajectory very similar to the persistent and misguided attempts to mandate backdoors in, or outright ban, modern encryption. Attempts will be made, agencies and companies will lobby for legislation, the alleged solutions will inevitably fail, and some misguided politician will always be more than happy to campaign on the topic.

Large Language Models, much like modern cryptography back in the 90s, are already running locally, and anyone intending to regulate them in any way is going to find it difficult to restrict something of that nature.
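
For concreteness, here is a minimal sketch of what "running locally" already looks like, assuming the Hugging Face transformers library and a small open model such as databricks/dolly-v2-3b (the Dolly family is linked further down this thread); the model choice and generation parameters are only illustrative:

    # Minimal local text generation with a small open model.
    # Assumes: pip install torch transformers accelerate, plus enough
    # RAM/VRAM for a ~3B-parameter model in bfloat16 (~6 GB).
    import torch
    from transformers import pipeline

    generate = pipeline(
        "text-generation",
        model="databricks/dolly-v2-3b",  # illustrative; any open checkpoint works
        torch_dtype=torch.bfloat16,      # halves memory versus float32
        trust_remote_code=True,          # Dolly ships a custom instruct pipeline
        device_map="auto",               # uses a GPU if one is available
    )

    result = generate("Explain why export controls on software are hard to enforce.",
                      max_new_tokens=128)
    print(result[0]["generated_text"])

None of this needs a datacenter, an API key, or anyone's permission, which is the point.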


Cutting-edge models are quite expensive to train, so it would be entirely possible to restrict them (unfortunately).


You are absolutely right for the moment, but the question is whether this will always remain the case. I am honestly doubtful, and fairly certain this is more a question of when than if. Seeing as both Nvidia and Apple continue making consumer hardware that does well at inference, I fail to see why training sufficiently capable models locally would not become more commonplace, especially as there are already murmurs in the field that very large models (100B+ parameters) may not be necessary for many use cases.

Add to that the fact that institutions (including research institutes[0][1] and universities outside US or EU jurisdictions) are likely to keep training larger models and making the fruits of their labor freely accessible, and I just do not see a way to centrally enforce specific LLM training rules.

[0] https://huggingface.co/tiiuae/falcon-40b-instruct

[1] https://www.databricks.com/blog/2023/04/12/dolly-first-open-...


Yeah, we will find ways of training these things faster, and we will have faster hardware. We will have better, more parameter-efficient models. This will inevitably be something we can do locally. You can already trade time for money: right now you can buy hardware for less than the cost of a house, train on it for a year, and end up with something roughly as capable as GPT-3.5. This is already possible, it only needs to be done once (and you can still sell the hardware afterwards), and a single dedicated person is capable of doing it by themselves. It will only get faster and cheaper from here. The same goes for inference: still expensive to run, but it will get cheaper.
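
As a rough sanity check on that claim, here is a back-of-the-envelope sketch using the common ~6 x parameters x tokens approximation for training FLOPs; the model size, token count, GPU count, price framing, and utilization are all assumptions for illustration, not a real budget:

    # Back-of-the-envelope: can "less than a house" of hardware train a
    # capable model in about a year? All numbers are illustrative assumptions.
    params = 13e9          # assume a ~13B-parameter model
    tokens = 1.0e12        # assume ~1T training tokens
    train_flops = 6 * params * tokens   # standard ~6*N*D compute estimate

    gpus = 8               # assume a single 8-GPU box
    peak_flops = 989e12    # H100 dense BF16 peak, per GPU
    utilization = 0.4      # assume ~40% of peak is actually sustained

    sustained = gpus * peak_flops * utilization
    seconds = train_flops / sustained
    print(f"{seconds / 86400:.0f} days of continuous training")  # ~285 days

Whether a 13B model trained on 1T tokens really lands at GPT-3.5 level is debatable, but the arithmetic shows the compute itself is within reach of one determined person, and every term in it keeps moving in their favor.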


They’re expensive for you, a random person on the internet, but 5-10M USD is really not prohibitively expensive for a government or large company.


Now they are being explicit about their intentions:

> AI developers could be required to receive a license to create highly capable foundation models which are likely to prove more capable than models previously shown to be safe

I would like to see their faces when the FCC deems that GPT-4 is not safe. All we have to do is show that it can act like a Trumpist.


Moratorium on freaking out about AI https://kyledrake.com/writings/ai


It's the last section, under the heading: Registration and Licensing for Highly Capable Foundation Models.


I trust Sam to prevent human extinction.


I presume that's sarcasm, considering he spends a lot of energy preparing for that event.


Sam loves human extinction fear because it increases his shot at licensing. When Sam says "down", "up", "left" and "right", he is actually saying "down", "up", "right" and "left". It just takes a little translation, but he comes across loud and clear.


[flagged]


I think what you meant to say is:

"OpenAI desperately wants for nobody else to be able to come along and play in their sandbox so instead of facing competition head-on and playing fair, they are going to employ "regulatory capture" and try and enjoin government to grant themselves (and a few other incumbent players) an unfair advantage over newcomers."


Looks like a bot to me



