One of the most interesting parts of the EU AI Act is Recital 105, which forbids unrestricted use of copyrighted content for generative AI.
It is specifically "opt-out" but does not specify how opting out should be declared.
I think it would mean that any web scraper that is mining data for a "generative AI" model would need to be able to detect copyright notices and interpret them correctly. I'd think that "All Rights Reserved" definitely means "No".
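To make that concrete, here's a rough sketch of the kind of checks such a scraper might have to run. This is purely an illustration, since the Act itself doesn't define a machine-readable opt-out: the crawler name ("ExampleAIBot"), the robots.txt check, the "All Rights Reserved" heuristic and the draft W3C "tdm-reservation" meta tag are all assumptions on my part, not anything the law specifies.

    import re
    import urllib.robotparser
    from urllib.parse import urljoin, urlparse

    import requests  # third-party: pip install requests

    CRAWLER_NAME = "ExampleAIBot"  # hypothetical user-agent for a GenAI crawler

    def robots_allows(url: str) -> bool:
        """Check the site's robots.txt for our (hypothetical) crawler name."""
        parts = urlparse(url)
        rp = urllib.robotparser.RobotFileParser()
        rp.set_url(urljoin(f"{parts.scheme}://{parts.netloc}", "/robots.txt"))
        try:
            rp.read()
        except OSError:
            return False  # can't read robots.txt: err on the side of not scraping
        return rp.can_fetch(CRAWLER_NAME, url)

    def page_opts_out(html: str) -> bool:
        """Heuristics for an in-page rights reservation."""
        # Treat a plain "All Rights Reserved" notice as a "no".
        if re.search(r"all\s+rights\s+reserved", html, re.IGNORECASE):
            return True
        # Machine-readable signal, e.g. <meta name="tdm-reservation" content="1">
        # (from the W3C TDM Reservation Protocol draft); also treated as an opt-out.
        if re.search(
            r'<meta[^>]+name=["\']tdm-reservation["\'][^>]+content=["\']1["\']',
            html, re.IGNORECASE,
        ):
            return True
        return False

    def may_use_for_training(url: str) -> bool:
        if not robots_allows(url):
            return False
        resp = requests.get(url, headers={"User-Agent": CRAWLER_NAME}, timeout=10)
        return not page_opts_out(resp.text)

The point being that all of these are heuristics; until the opt-out format is standardised, a scraper can't actually be sure it interpreted a rights notice correctly.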
Just finished reading the Wikipedia article in full and it actually seems fairly sensible given how quickly this had to come together.
It draws boundaries in what seem like sensible places (potential for social harm, amount of compute involved in training, etc.).
It tries to stay entirely out of the way of a bunch of non-controversial use cases.
Honestly I don't hate it at first glance. The biggest thing I think people will get upset about is that, like GDPR, this doesn't just target EU-based companies but any company that wants EU users.
The EU doesn't have the legal authority to legislate when it comes to national security matters - the member states themselves retain exclusive control.
I think the arbitrary training-compute cutoff is the only problem. It plays into the whole imaginary AGI Terminator scenario. I'm glad the rest of the act seems to target specific use cases that do in fact need regulation, like social scoring etc.
"manipulate human behaviour" falls under unacceptable risk unless you get an exception. Does that include AI girlfriends? What about embedded advertisements?
Yet another EU law that disadvantages small players in favour of large players by raising the cost of entry. They seem to have forgotten the legions of small players that played a big role in building the modern world.
And, just like the Cyber Resilience Act, it can be used to SLAPP pesky people. After years of work, their future can be stolen from them and given to the wealthy.
The EU must really hate its citizens. Time to exclude EU government use in the licenses of all your open source projects.
Relatively open is also okay when they're killing small players off.
Excluding government usage is hardly very closed.
Honestly, some folks have worked for years on open source software that governments also benefited from, only to find that the cost of paying someone to certify your software, plus the cost of the risk, is now too high. Politicians need to feel the consequences of their decisions.
According to the OSI's "Open Source Definition" [1], indeed. It forbids discriminating against any person or group in their use of the software.
Most public source code repositories on the web require every piece of software published on them to be under an OSI-approved license. Not licensing under an OSI-approved license could therefore impede the chances of getting your software distributed and used.
I've been thinking about a related issue: perhaps there is a need to formulate a wider, but still closed, definition that repositories could adopt, one that would also include some "ethical" source code licenses and "Non-AI licenses" [2].
Because two key parts of the open source definition[0] are that the license "must not discriminate against any person or group of persons" (criterion 5) and that the license "must not restrict anyone from making use of the program in a specific field of endeavour" (criterion 6).
For the same reason you couldn't forbid military or commercial usage while still remaining Free Software.
"The freedom to run the program as you wish, for any purpose" also applies to entities such as the EU. If you forbid it from using the software, you compromise this freedom, and thus aren't Free Software.