• Everything you say makes sense... but it's not what the Online Safety Act actually says.

    And if we're quoting Ofcom, they also ominously state that what they published on 16 December is "just the beginning": https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/time-for-tech-firms-to-act-uk-online-safety-regulation-comes-into-force/

    This is just the beginning

    This first set of codes and guidance, which sets up the enforceable regime, is a firm foundation on which to build. In light of the helpful responses we received to our consultation, we are already working towards an additional consultation on further codes measures in Spring 2025. This will include proposals in the following areas:

    • blocking the accounts of those found to have shared CSAM;
    • use of AI to tackle illegal harms, including CSAM;
    • use of hash-matching to prevent the sharing of non-consensual intimate imagery and terrorist content; and
    • crisis response protocols for emergency events (such as last summer’s riots).

    And the Ofcom head herself, on that page, states:

    Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.

    You've focused on a single aspect of risk, and it's the one that is trivial to mitigate... the liability of an entity can be solved by setting up a CIC.

    But you didn't focus on the risk of criminal liability faced by officers of the entity, which breaks through the limited-liability protection of a Ltd company or a CIC... and for which, if you get arrested, you can no longer travel visa-free to the USA or Europe under ESTA / ETIAS... it's an admittedly small risk, but in the last 12 months I've travelled to 17 cities over 21 trips (some cities obviously more than once)... so merely interacting with the law is career-ending in my case.

    You also didn't focus on the cost of compliance which I outlined in the first post: the cost of legal compliance, technical compliance, training materials, response time, etc... and, as the sole person who has been doing this for so long, it's a cliff face of effort to climb. Some of the technical compliance speaks of requirements to use tools that do not yet exist, but which I can guarantee won't be open source and free to use, e.g. age verification, AI content scanning, etc... so a real-world increase in the cost to operate too (needing more donations).

    I outlined in the first post the two key documents, published by Ofcom on 16 December, and called out the specific parts of their own guidance that would describe this site and other fora as "Medium risk" (IANAL, that's my interpretation) and a "Multi-risk service" (IANAL, just experience of being a site admin)... the costs I outline appear to be very real.

    The risk is arguably low, but the costs of compliance are high, and presumably if something harmful did occur and we were not compliant then the risk is much, much higher.

    And remember, this isn't just about illegal content, which we've always been very good at moderating, as has every other forum admin I've ever interacted with... this is "harmful but not illegal" content... and the descriptions of that are vague and subjective.

    A page or two ago I gave a real-world example of an event that happened on this site, when a person who could easily be argued to be vulnerable faked their own death and attempted to fund-raise for their own funeral... this ticks so many of the "harmful but not illegal" boxes, and I have many examples of such things over the years... sure, I chose that example because people here likely recall it and it ticked all the boxes at once, but so many examples tick a box or two almost daily.

    Your words are reassuring, but they don't begin to cover the complexity of what the law is, what the guidance is, what the real-world experience of moderating a forum is, and the personal circumstances of the admins who run forums.

    Unfortunately, when I've been appalled at laws in the past that were introduced under the promise that they would never be abused, in the UK we can so often cite hundreds of examples of them being abused or mishandled through incompetence: from councils using terrorism laws to refuse disabled parking permits, through to scandals like the Post Office subpostmaster affair, which shows how the slow wheels of bureaucracy just do not care. As a trans person dealing with the health system I can see how malicious a Govt system can be, and I cannot imagine how badly Ofcom will handle a case when it arises - the argument that "they are understaffed so will only focus on big tech" rings empty to me; I feel they'll just do everything badly, and badly here has consequences for me. The unimaginable happens... it's a failure of imagination to believe that it does not.

    Your tolerance threshold for risk is different from mine; you aren't the one accepting liability here or bearing these risks; it seems you have not seen the things that occur on the internet daily; and you'll forgive me for having the imagination to believe that these risks, for me, are real, as are the costs of compliance.

  • I'm not someone who has ever used this forum, but I am very much against this bill and came here from discussions elsewhere.

    Have you perhaps considered contacting Ofcom to clarify this uncertainty? Or maybe your local MP?
