He's the current Secretary of State for the Department of Health and Social Care. He's by far the most tech-orientated SoS we've had for years, doing a lot of work to push digital in health. He's rampantly pro-IT.
Sometimes when politicians make requests like this (make it harder to access images of self-harm), people dismiss them as "think of the children". That would be a mistake here. He's not asking for all images to be removed; he is asking for the malgorithmic pushing of self-harm content to vulnerable people to be fixed.
People sometimes complain about laws that appear out of the blue. His tweet above is the start of a long, slow process of building a law. It's a clear warning: get better at self-regulating, or we'll regulate you.
The lead for suicide prevention in the UK (Professor Louis Appleby) has this to say: https://twitter.com/ProfLAppleby/status/1089528954158043136

"Self-harm images on Instagram just part of problem we need to address. In our national study, 1/4 under 20s who died by suicide had relevant internet use & most common was searching for info on methods"
and this: https://twitter.com/ProfLAppleby/status/1089525522084884480

"Important change in political/social attitude. Just a few years ago, internet seen as free space, no restrictions, complete lack of interest in #suicideprevention from big companies. Now mood is for regulation, social responsibility, safety."
Finally, here's my example of malgorithm ad placement. I've mentioned this example before, and I think it got fixed (so thank you if you fixed it!), but I search for suicide-related terms for my work, and sometimes the ads are terrible:

https://imgur.com/hhOYUJb
> People sometimes complain about laws that appear out of the blue. His tweet above is the start of a long, slow process of building a law. It's a clear warning: get better at self-regulating, or we'll regulate you.
You're absolutely right! One reading is that it's a request: please fix this problem before we have to regulate you into fixing it.

Is there an alternative reading, though? A cynic might suggest that humoring such a plea is a great way to demonstrate that content problems like this can be solved! Then regulators can require those very useful tools be applied to whatever they please, in a much more general way.
The odds that whatever tooling Secretary Hancock gets to solve the very real, pressing problem he has so wisely pointed to will be completely inapplicable to anything else are virtually zero. I can think of a few places where "safety" and "social responsibility" mean things like never disagreeing with The Party.
As technologists, it's on us to think through the consequences of our choices where we can. It's often not plausible - nobody thought TCP/IP would lead to malgorithmic ads! But tools designed to enforce arbitrarily defined social mores?
The problem with this approach is that you're going to be fixing this "leaky pipe" forever. It also sets another precedent: if you make enough noise, the internet gets curated "for our own good".