Then you will see that a forum that allows user-generated content, and isn't proactively moderated (approval prior to publishing, which would never work for even a small, moderately busy forum of 50 people chatting)... will fall under "All Services" and "Multi-Risk Services".
This means I would be required to do all the following:
1. Individual accountable for illegal content safety duties and reporting and complaints duties
2. Written statements of responsibilities
3. Internal monitoring and assurance
4. Tracking evidence of new and increasing illegal harm
5. Code of conduct regarding protection of users from illegal harm
6. Compliance training
7. Having a content moderation function to review and assess suspected illegal content
8. Having a content moderation function that allows for the swift take down of illegal content
9. Setting internal content policies
10. Provision of materials to volunteers
11. (Probably this, because of file attachments) Using hash matching to detect and remove CSAM
12. (Probably this, though one could implement Google Safe Browsing) Detecting and removing content matching listed CSAM URLs (a rough sketch of the hash-matching part appears after this list)
...
the list goes on.
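To give a sense of what items 11 and 12 ask of a lone operator, here is a minimal sketch of hash matching uploaded attachments against a supplied hash list. Everything in it is an assumption for illustration: the hash list file, its format, and the use of exact SHA-256 digests (production systems typically rely on perceptual hashing such as PhotoDNA, and the lists themselves come from vetted bodies that require registration to access).

```python
# Illustrative sketch only: check an uploaded attachment against a hash list.
# Assumptions (not from the original post): "known_hashes.txt" is a local file
# of hex-encoded SHA-256 digests obtained from a vetted source; real CSAM
# detection usually uses perceptual hashes (e.g. PhotoDNA) rather than exact
# cryptographic hashes, so this is a simplification.
import hashlib
from pathlib import Path


def load_hash_list(path: str) -> set[str]:
    """Load one lowercase hex digest per line, skipping blanks."""
    return {
        line.strip().lower()
        for line in Path(path).read_text().splitlines()
        if line.strip()
    }


def attachment_is_flagged(file_bytes: bytes, hash_list: set[str]) -> bool:
    """True if the attachment's SHA-256 digest appears in the list."""
    return hashlib.sha256(file_bytes).hexdigest() in hash_list


def handle_upload(file_bytes: bytes, hash_list: set[str]) -> str:
    # Quarantine rather than publish on a match; in practice a human review
    # and a report to the relevant authority would have to follow.
    return "quarantined" if attachment_is_flagged(file_bytes, hash_list) else "accepted"


if __name__ == "__main__":
    hashes = load_hash_list("known_hashes.txt")  # hypothetical local file
    print(handle_upload(b"example attachment bytes", hashes))
```

Even this toy version hints at the operational burden: the list has to be obtained and kept current, and every match still needs human review, quarantine, and reporting.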
It is technical work, extra time, being constantly on-call even when I'm on vacation, the need for extra volunteers, training materials for volunteers, appeals processes for moderation (in addition to the flak one already receives for moderating), somehow removing accounts of proscribed organisations (who has this list, and how would I know if an account is affiliated?), etc, etc.
Bear in mind I am a sole volunteer, and that I have a challenging and very enjoyable day job that is actually my primary focus.
Running the forums is an extra-curricular volunteer thing, a thing that I do for the good it does... I don't do it for the "fun" of learning how to become a compliance officer, or to spend my evenings implementing what I know will be technically flawed efforts to scan for CSAM and then more time correcting those mistakes.
I really do not think I am throwing the baby out with the bathwater, but I did stay awake last night dwelling on that very question. The decision wasn't easily taken and I'm not at ease with it; it was a hard choice, but I believe it's the right one for what I can give to it... I've given over 28 years, and there's a time to say that it's enough. The chilling effect of this legislation has changed the nature of what I was working on, and I don't accept these new conditions.
The vast majority of the risk can be realised by a single disgruntled user on a VPN from who knows where posting a lot of abuse material when I happen to not be paying attention (travelling for work and focusing on IRL things)... and then the consequences and liability come. This isn't a risk I'm in control of or one that can be easily mitigated; the effort required is high, and everyone here knows you cannot solve social issues with technical solutions.
Thanks for all your work buro9! I've been an LFGSS user for 15 years. This closure as a result of bureaucratic overreach is a great cultural loss to the world (I'm in Canada). The zany antics and banter of the London biking community have given me, and the contacts I've shared them with, many interesting thoughts, opinions, points of view, and memes, all from a unique and authentic London local point of view.
LFGSS is more culturally relevant than the BBC!
Of course governments and regulators will fail to realize what they have till it's gone.
> The vast majority of the risk can be realised by a single disgruntled user on a VPN from who knows where posting a lot of abuse material when I happen to not be paying attention (travelling for work and focusing on IRL things)... and then the consequences and liability come. This isn't a risk I'm in control of or one that can be easily mitigated; the effort required is high, and everyone here knows you cannot solve social issues with technical solutions.
I bet you weren't the sole moderator of LFGSS. In every web forum I know, there is at least one moderator online every day, plus more senior members able to use a report function. I used to be a moderator for a much smaller forum and we had 4 to 5 moderators at any time, some of whom were online every day or almost every day.
I think a number of features/settings would be interesting for forum software in 2025:
- deactivation of private messages: people can use instant messaging for that
- automatically blur a post when the report button is hit by a member (and by blur I mean replacing the full post server-side with an image, not doing client-side JavaScript)
- automatically blur posts that haven't been seen by a member of the moderation team or a senior-level member within a certain period (6 or 12 hours, for example)
- disallow new members from reporting and blurring content; only members known to be in good standing get that ability
All this does not remove the bureaucracy of the assessments/audits mandated by the law, but it should at least make forums moderatable and provide a modicum of security against illegal/CSAM content; a rough sketch of the report-and-blur flow is below.
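Here is a minimal sketch of the report-and-blur idea, assuming a simple in-memory post model rather than any particular forum engine; the trust threshold, field names, and placeholder text are all illustrative assumptions.

```python
# Illustrative sketch of "blur on report": a trusted member's report replaces
# the stored post body server-side with a placeholder until a moderator
# reviews it. Thresholds and names are assumptions, not any real forum's API.
from __future__ import annotations

from dataclasses import dataclass

TRUST_THRESHOLD_DAYS = 90  # assumption: only accounts older than this can trigger a blur


@dataclass
class Member:
    id: int
    account_age_days: int
    is_moderator: bool = False


@dataclass
class Post:
    id: int
    body: str
    hidden: bool = False
    original_body: str | None = None


def report_post(post: Post, reporter: Member) -> bool:
    """Blur the post server-side if the reporter is trusted; return True if blurred."""
    if not (reporter.is_moderator or reporter.account_age_days >= TRUST_THRESHOLD_DAYS):
        return False  # new members can still file a report, but it does not blur anything
    if not post.hidden:
        post.original_body = post.body
        post.body = "[post hidden pending moderator review]"
        post.hidden = True
    return True


def review_post(post: Post, moderator: Member, allowed: bool) -> None:
    """After review, a moderator either restores the original body or removes it."""
    if not (moderator.is_moderator and post.hidden):
        return
    post.body = post.original_body if allowed and post.original_body else "[removed]"
    post.hidden = False
    post.original_body = None
```

The point of doing the replacement server-side, as suggested above, is that the content genuinely stops being served the moment a trusted report arrives, rather than being hidden only in the reader's browser.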