I think OP feels it indirectly creates massive personal liabilities for site operators, in that a user can deliberately upload illegal material and then report the site under the Act, opening the site operator up to £18M in fines.
This seems very plausible to me, given what they and other moderators have said about the lengths some people will go to online when they feel antagonised.
The UK has lots of regulatory bodies, and they all work in broadly the same way. Provided you do the bare minimum to comply with the rules, which the regulator sets out in plain English, you won't be fined or held personally liable. It's only companies that repeatedly or maliciously fail to put basic measures in place that end up being prosecuted.
If someone starts maliciously uploading CSAM and reporting you, provided you can demonstrate you're taking whatever measures are recommended by Ofcom for the risk level of your business (e.g. deleting reported threads and reporting to police), you'll be absolutely fine. If anything, the regulators will likely prove to be quite toothless.
Hopefully the new law is enforced sensibly, i.e., with much leniency given to smaller defendants, but hoping for that is a terrible strategy. The risk is certainly not zero, contrary to what you claim -- all it takes is one high-profile case in which leniency leads to some terrible outcome (e.g., child abuse) making the news, and the government employees responsible for enforcement will snap to a policy of zero tolerance.
> provided you can demonstrate you're taking whatever measures are recommended by Ofcom
That level of moderation might not be remotely feasible for a sole operator. And yes, there's a legitimate social question here: should we as a society permit sites/forums that cannot be moderated to that extent? But the point I'm trying to make is not whether the answer to that question is yes or no; it's that the consequence of this Act is that no sensible individual or small group will now take on the risk of running such a site.
While they could, I'm pretty sure that's already illegal, probably in multiple ways.
In the same way that you could be sued for anything, I'm sure you could also be dragged to court for things like that under this law... And probably under existing laws, too.
That doesn't mean you'll lose, though. It just means you're out some time and money and stress.
>While they could, I'm pretty sure that's already illegal, probably in multiple ways
Heh, welcome to the internet, where the perpetrator and the beneficiary can sit in different jurisdictions, making enforcement against the original bad actors impossible.
For example, have a friend in China upload something terrible to a UK site and then 'drop the dime' to the regulator in the UK. The UK state can easily come after you, but will find it nearly impossible to go after the international actor.
>In the same way that you could be sued for anything,
The risk and cost imbalance is much more extreme than that of a lawsuit.
I'm confident that, were I sufficiently motivated, I could upload a swathe of incriminating material to a website and cover my tracks within a couple of hours, doing damage that could cost the site operator £18M at no risk to myself -- not even my identity would be revealed. OTOH, starting a lawsuit at the very least requires me to pay for a lawyer's time and to show my face in court -- and if the suit is thrown out, I'll have to pay the other side's costs, too.
Again, this sounds like the nonsense spoken on here when GDPR came out. Everyone was going to get fined millions. Except that people with violations actually got compliance advice from the ICO. They only got fined (a small amount of money) when they totally ignored the ICO.