This sounds like a completely backwards and nonsensical way to solve the (possibly legitimate) problems they have:
- you can choose to release partially anonymized data (e.g. without the identities of judges) and pass laws forbidding the use of ML to statistically de-anonymize it (this is already done with medical data and can work fine; see the sketch after this list)
- ...you CAN'T have the data available but prevent people from using it
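Just to make the first point concrete, here's a minimal sketch (in Python, with made-up field names like "judge" and "outcome") of the kind of partial anonymization meant above: judge names are replaced with stable keyed pseudonyms, so the records stay linkable for statistics but the names themselves are never published. The key stays with the publishing authority and is never released with the data.

```python
import hmac
import hashlib

# Held by the publishing authority only; never released alongside the data.
SECRET_KEY = b"kept-by-the-publishing-authority"

def pseudonymize(name: str) -> str:
    # HMAC gives each judge a stable pseudonym without exposing the name;
    # without the key, recovering it means guessing and checking candidates.
    return hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).hexdigest()[:12]

# Hypothetical court records, purely for illustration.
records = [
    {"judge": "Jane Doe", "court": "Paris", "outcome": "granted"},
    {"judge": "Jane Doe", "court": "Paris", "outcome": "denied"},
    {"judge": "John Roe", "court": "Lyon", "outcome": "granted"},
]

# The released dataset keeps everything except the real names.
released = [{**r, "judge": pseudonymize(r["judge"])} for r in records]

for r in released:
    print(r)
```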
Or did we completely misunderstand what this is about?
Dunno WTF is going on with this pattern of legislation cropping up around the EU (like the copyright directives etc.) that can only be explained by either (a) a level of incompetence on the part of the legislators bordering on actual mental impairment, or (b) a really dark agenda being played out behind the scenes (think Germany ca. 1939...).
This is sad, because some of us really love the idea of the EU as an open society!
It's like there's some dark force trying to destroy it and its countries from within under the guise of "protecting" imaginary "rights".