• How exactly does this work? From skimming the article, it seems like as long as you respond to people reporting horrendous behaviour and revenge porn, have an active moderation team, and comply with Ofcom requests, it shouldn't change much? What exactly have I missed here - i.e. how does this add any more administrative overhead than, say, responding to DMCA requests?

    Also, could the site be set up and owned by a Ltd so that if it did get fined, the risk falls on the company rather than anyone personally?

    Terrorism
    Harassment, stalking, threats and abuse offences
    Coercive and controlling behaviour
    Hate offences
    Intimate image abuse
    Extreme pornography
    Child sexual exploitation and abuse
    Sexual exploitation of adults
    Unlawful immigration
    Human trafficking
    Fraud and financial offences
    Proceeds of crime
    Assisting or encouraging suicide
    Drugs and psychoactive substances
    Weapons offences (knives, firearms, and other weapons)
    Foreign interference
    Animal welfare

    Surely most of these things are against the forum rules anyway, and any fines would only be levied against forums that repeatedly don't bother to try and take stuff down?

    I'm looking over some of the supporting documents; most of the requirements seem to apply only to large providers, not small communities.

    The ones that do apply:

    Providers should have systems and processes designed to review and assess content the provider has reason to suspect may be illegal content (part of its ‘content moderation function’)

    This is just having a moderation system, surely?

    Providers should have systems and processes designed to swiftly take down illegal content and/or illegal content proxy of which they are aware (part of their ‘content moderation function’), unless it is currently not technically feasible for them to achieve this outcome.

    All forums have this anyway.

    All providers of U2U and search services should have complaints systems and processes

    All forums have this anyway; it maybe just needs to be formalised.

    Providers should handle complaints about suspected illegal content in accordance with their content prioritisation processes

    Pretty sure any forum admin would respond to PMs about this with haste.
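
    To make the "this is just a moderation system" point concrete, here's a minimal sketch of what those three requirements amount to in forum code: a complaints process, a prioritised review queue, and a takedown path. Every name and type below is hypothetical - it's not microcosm's API or anything Ofcom prescribes - but any forum engine that can do roughly this already has the "systems and processes" being asked for.

        package main

        import (
            "fmt"
            "sort"
            "time"
        )

        // Report is one user complaint about a post.
        type Report struct {
            PostID    int
            Reason    string // e.g. "intimate image abuse"
            Priority  int    // suspected-illegal content gets reviewed first
            CreatedAt time.Time
        }

        // Moderation bundles the "complaints system" (FileReport) and the
        // "content moderation function" (Review) from the quoted requirements.
        type Moderation struct {
            queue   []Report
            removed map[int]bool
        }

        // FileReport is the complaints process: any user can flag a post.
        func (m *Moderation) FileReport(postID int, reason string, priority int) {
            m.queue = append(m.queue, Report{postID, reason, priority, time.Now()})
        }

        // Review works through the queue, highest priority first, and takes
        // down anything the moderator judges illegal (the "swiftly take
        // down" duty).
        func (m *Moderation) Review(isIllegal func(Report) bool) {
            sort.Slice(m.queue, func(i, j int) bool {
                return m.queue[i].Priority > m.queue[j].Priority
            })
            for _, r := range m.queue {
                if isIllegal(r) {
                    m.removed[r.PostID] = true
                }
            }
            m.queue = nil
        }

        func main() {
            m := &Moderation{removed: map[int]bool{}}
            m.FileReport(42, "intimate image abuse", 10)
            m.FileReport(7, "mild spam", 1)
            m.Review(func(r Report) bool { return r.Priority >= 10 })
            fmt.Println(m.removed) // map[42:true]
        }

    Nothing here is exotic: a report button, a queue the mods work through, and a delete action. If your forum already has those, you're already running something shaped like the quoted requirements.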

    Without going through the full document, it seems like if you have a small, moderated forum with responsive admins, most of this shouldn't really affect you. I think the legislation is more targeted at large providers who let shitloads of terrible stuff through without accountability, and at smaller providers who don't really bother moderating their own content properly, or ignore it when someone complains that their ex uploaded nudes, etc.

    I'm not sure forum software (e.g. microcosm) is really threatened by the act; as long as it provides moderation tools, surely it's the people who run the boards who are liable.

    It seems like an incredible amount of value could be lost, and perhaps reaching out to Ofcom directly to open a dialogue and establish compliance could be a better solution - I doubt very much that they're just going to hit people with huge fines without first trying to rectify the problem.

  • The text of the legislation is unfortunately very unhelpful and far broader than what the summary doc you're reading says, if you're coming at things from a risk-minimisation perspective. Dialogue with Ofcom may only do so much if you are technically in noncompliance because you simply don't have the resources, and even if you are trying to be compliant, this act opens up far more spurious heads of liability for people with a bone to pick to cause a lot of trouble, which may not depend entirely on regulator action.
