I feel, though, that the point of the GDPR was to protect our personal data held by companies, not to prevent companies from using our personal data to make money.
So if a company uses your personal data to train a model (let's assume you willingly gave your informed consent for the time being), and then they delete your data after they have trained their model, does that model contain your personally identifiable information? I'd argue that it does not - the model is just some weights, right? So 0.6, 34.291, 0.0016 - is that you, mum?
...but having just said that, I do wonder what happens if you run the model in reverse, like the DeepDream stuff did (1). Could it re-generate PII (or rather generate "nearly-PII") purely from those weights?
1 - https://en.wikipedia.org/wiki/DeepDream
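The "run the model in reverse" idea can be sketched in a few lines. This is a toy, not anyone's actual system: a hypothetical logistic-regression "model" trained on synthetic two-feature data, then inverted DeepDream-style by doing gradient ascent on an *input* (starting from noise) to maximise the model's output score. The recovered input isn't any one training record, but it does drift toward what the training data looked like - which is the "nearly-PII" worry in miniature.

```python
import numpy as np

# All data here is synthetic - stand-in for "personal data", nothing real.
rng = np.random.default_rng(0)

# Two classes of fake records: class 0 clustered at (-2,-2), class 1 at (2,2)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Train a logistic regression by gradient descent - the "company trains a
# model, then deletes the data" step. Afterwards only w and b remain.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))      # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)     # gradient step on weights
    b -= 0.5 * np.mean(p - y)               # gradient step on bias

# Now "run it in reverse": starting from noise, nudge an input in the
# direction that raises the class-1 score. For a linear model the gradient
# of the score w.r.t. the input is just w.
x = rng.normal(0, 0.1, 2)
for _ in range(200):
    x += 0.1 * w
    x = np.clip(x, -4, 4)  # keep the synthesised input in a plausible range

print(x)  # ends up in the class-1 region, near where that cluster lived
```

The synthesised point lands around (2, 2)-ish territory - not any individual's record, but a prototype of the class, reconstructed purely from the weights. Scaled up to deep networks and richer data, that's roughly the question model-inversion attacks ask.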