-
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
Classifieds includes the risk of fraud and the risk of handling stolen goods, in addition to the other risks outlined in the guidance.
Remember that illegal isn't the threshold; "legal but harmful" is the threshold. And the Act says nothing that requires the people involved in the sale to be the ones who were harmed: a reader can be harmed.
-
-
Patreon
I researched a lot of the providers, and Patreon would've worked for us... but it really only works for a single person.
Open Collective is by far the better option here: it allows multiple people to be on the account, lets different people pay different bills and still be reimbursed, and does it all transparently.
There is only a single reason I didn't do Open Collective sooner... you can't migrate subscriptions from PayPal.
But I will cancel all subscriptions, so if this group has its stuff together and is committed to continuing to run LFGSS (and a few of the other sites like Espruino or PignoleFixe)... then start over afresh on Open Collective whilst you have the goodwill and enthusiasm of people to get the place into a good state, and whilst it has the extreme transparency of showing how it all adds up.
-
-
You might not need it... Open Collective shows the list of donors and gives you the option to declare that or not; further, Open Collective makes it clear whether the tank is full or empty... I should've built that transparency sooner, as people would likely top it up when they saw it was empty.
But implementing badges and the like will take more technical integration, and will produce a little rift between the haves and have-nots... I've always liked "pay what you feel it's worth, according to your means"... and largely that does work.
But warm fuzzies come from being listed on the Open Collective page... and some people absolutely do prefer anonymity in their donations, so bear in mind that if you add badges, some will want to hide them.
-
What I guess we need is an image file that folks can take to the studio of their choice in their area.
I've got the vector and other formats of the logo as per the top-left header.
It was originally designed by Big Daddy Wayne, in fact this whole site looks good because of Wayne.
Line work from that could easily be incorporated into other tattoos, which would give people personalised options.
-
I'm sure that @velocio has a few trusted friends on here he could ask to review applications.
they.
This community knows who the good people are. I consciously chose to have very few moderators years ago... just @hippy and me, and then a few people for the Ladies section and the Polo section.
IMHO you want as few as possible whilst still satisfying yourself that you never have a gap in moderation (coverage across time and when people go on vacation, etc), and the criteria are something like:
- Kind by default
- Virtually never takes things personally, or at least remains very unbiased despite people being asshats
- Patience of a saint
- No desire for power (I've found the best mods are those who never ask for it, hippy certainly didn't)
- Decisive, even if the decision is not to act
Things like consistency come from experience, but consistency can also be gamed... smart people who are trying to bully someone learn where the line is and go right up to it. I try not to make a fixed line; it moves at times, and what is acceptable is a fluid thing... or at least, it was before the OSA; you might all deem it to be a very fixed, risk-averse thing now.
-
-
-
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
I am always staggered by the fact that @Velocio has kept up this place ad-free and never monetised
I'll out myself as an anti-capitalist... or rather, I have a fundamental belief in privacy: that for us to have a democracy we require privacy at its core, which means virtually no surveillance, etc.
I just couldn't bring myself to build what the rest of the internet has become. From my perspective, some of what this law represents is exactly that. My rejection of this is on so many fronts.
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
This is a nice example of how regulation is good, actually. Absolutely massive fines for Meta, Google, Amazon etc. Actually confronting these corporate behemoths who act like they are above the law and fuck us all over, all the time, normally with impunity.
Oh gosh yes, I cannot wait for X to be in the firing line of this law, and I am here for that.
The law should actually encapsulate the spirit of "we only go after those whose negligence carries the most widespread consequence", and enshrine a "this act applies to all whose global annual turnover is 1,000x the UK national average salary" (this phrasing allows the threshold to move over time... today the national average salary is about £36k, meaning the threshold would be £36M... which covers every major company, but excludes all small voluntary groups, sports clubs, CICs, small businesses, small charities, etc).
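To make that carve-out concrete, here's a minimal sketch (in Go, purely illustrative, and nothing the Act actually contains) of the proposed rule: a service is in scope only if its global annual turnover exceeds 1,000x the UK national average salary, using the ~£36k figure above.

```go
package main

import "fmt"

// Hypothetical threshold rule from the post above: in scope only if global
// annual turnover >= 1,000x the UK national average salary. The salary figure
// is the approximate £36k quoted above; the example turnovers are made up.
func main() {
	const multiplier = 1_000
	averageSalaryGBP := 36_000.0

	threshold := averageSalaryGBP * multiplier // ~£36M today, moves as salaries move

	turnovers := map[string]float64{
		"small volunteer-run forum": 5_000,
		"major platform":            500_000_000,
	}
	for name, turnover := range turnovers {
		fmt.Printf("%s: turnover £%.0f, in scope: %v\n", name, turnover, turnover >= threshold)
	}
}
```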
X... holy crap, this law should just block X from operating in the UK and block their domain and IPs.
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
Everything you say makes sense... but it's not what the Online Safety Act actually says.
And if we're quoting Ofcom, then they also ominously state that what they published on Dec 16th is "This is just the beginning...": https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/time-for-tech-firms-to-act-uk-online-safety-regulation-comes-into-force/
This is just the beginning
This first set of codes and guidance, which sets up the enforceable regime, is a firm foundation on which to build. In light of the helpful responses we received to our consultation, we are already working towards an additional consultation on further codes measures in Spring 2025. This will include proposals in the following areas:
- blocking the accounts of those found to have shared CSAM;
- use of AI to tackle illegal harms, including CSAM;
- use of hash-matching to prevent the sharing of non-consensual intimate imagery and terrorist content; and
- crisis response protocols for emergency events (such as last summer’s riots).
And the Ofcom head herself, on that page, states:
Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.
You've focused on a single aspect of risk, and it's the one that is trivial to mitigate... liability of an entity can be solved by setting up a CIC.
But you didn't focus on the risk of criminal liability faced by officers of the entity, which breaks through the Ltd structure of a company or a CIC... and for which, if you get arrested, you can no longer travel visa-free to the USA or Europe under ESTA / ETIAS... it's an admittedly small risk, but in the last 12 months I've travelled to 17 cities over 21 trips (some cities obviously more than once)... so merely interacting with the law is career-ending in my case.
You also didn't focus on the cost of compliance which I outlined in the first post: the cost of legal compliance, technical compliance, training materials, response time, etc, etc... which, as the sole person who has been doing this for so long, is a cliff face of effort to climb. Some of the technical compliance speaks of requirements to use tools that do not yet exist, but which I can guarantee won't be open source and free to use, i.e. age verification, AI content scanning, etc... so a real-world increase in the cost to operate too (needing more donations).
I outlined in the first post the two key documents published by Ofcom on 16th December, and called out the specific parts of their own guidance that would describe this site and other fora as "medium risk" (IANAL, that's my interpretation) and a "multi-risk service" (IANAL, just experience of being a site admin)... the costs I outline appear to be very real.
The risk is arguably low, but the costs of compliance are high, and presumably if something harmful did occur and we were not compliant then the risk is much much higher.
And remember, this isn't just about illegal content (we've always been very good at moderating that, as has every other forum admin I've ever interacted with)... this is about "harmful but not illegal" content, and the descriptions of that are vague and subjective.
A page or two ago I gave a real-world example of an event that happened on this site, when a person who could easily be argued to be vulnerable faked their own death and attempted to fundraise for their own funeral... this ticks so many of the "harmful but not illegal" boxes, and I have many examples of such things over the years... sure, I chose that example because people here likely recall it and it ticked all the boxes at once, but so many examples tick a box or two almost daily.
Your words are reassuring, but they don't begin to cover the complexity of what the law is, what the guidance is, what the real-world experience of moderating a forum is, and the personal circumstances of the admins who run forums.
Unfortunately, when I've been appalled at laws in the past and seen them introduced under the promise that they would never be abused, in the UK we can so often cite hundreds of examples of them being abused or mishandled due to incompetence: from councils using terrorism laws to refuse disabled parking permits, through to scandals like the Post Office subpostmaster affair, which show how the slow wheels of bureaucracy just do not care. As a trans person dealing with the health system, I can see how malicious a Govt system can be, and I cannot imagine how badly Ofcom will handle a case when it arises. The argument of "they are understaffed so will only focus on the big tech" rings empty to me; I feel they'll just do everything badly, and badly here has consequences for me. The unimaginable happens... it's a failure of imagination to believe that it does not.
Your tolerance threshold for risk is different from mine; you aren't the one accepting this liability or these risks. It seems like you have not seen the things that occur on the internet daily, and you'll forgive me for having the imagination to believe that these risks, for me, are real, as are the costs of compliance.
-
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
It's old, and not currently suited to the legal issues of the day.
One could imagine ActivityPub being made a backend to the forum, and the forum being "everyone runs a local node and it talks to each other to create the whole", which is a bit like how email servers used to be.
Many parts of this forum platform were actually designed to enable multiple fora to share things, to seed new communities from the parts of others, because one of the natural lifecycles of a community is that it grows and schisms, and it's not a bad thing to embrace that.
It's surprising how much of this tech stack could, with a degree of effort, be plugged into GoToSocial and be fediverse native... but anyone running an instance that isn't a single-person instance still comes under the OSA. Perhaps you no longer care as it would create a million small websites rather than just 50-100k larger ones... too large a surface area for regulation... but it should be noted that fediverse sites are still subject to the regulation.
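For anyone curious what "fediverse native" might look like in practice, here's a rough sketch of a community represented as an ActivityPub Actor with an inbox and outbox, so posts could federate between instances much like email moves between mail servers. The field names follow the ActivityStreams/ActivityPub vocabulary, but the URLs are hypothetical and nothing here reflects how Microcosm or GoToSocial are actually wired together.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Actor is a minimal ActivityPub actor document for a community/forum.
// Only a handful of the spec's properties are shown.
type Actor struct {
	Context   string `json:"@context"`
	ID        string `json:"id"`
	Type      string `json:"type"` // "Group" suits a community rather than a person
	Name      string `json:"name"`
	Inbox     string `json:"inbox"`     // where other instances deliver activities
	Outbox    string `json:"outbox"`    // where this community's posts are published
	Followers string `json:"followers"` // remote instances/users subscribed to this community
}

func main() {
	// Hypothetical community; forum.example is a placeholder domain.
	forum := Actor{
		Context:   "https://www.w3.org/ns/activitystreams",
		ID:        "https://forum.example/communities/fixed-gear",
		Type:      "Group",
		Name:      "Fixed Gear",
		Inbox:     "https://forum.example/communities/fixed-gear/inbox",
		Outbox:    "https://forum.example/communities/fixed-gear/outbox",
		Followers: "https://forum.example/communities/fixed-gear/followers",
	}
	out, _ := json.MarshalIndent(forum, "", "  ")
	fmt.Println(string(out))
}
```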
It's interesting that it's the money that becomes the issue... because when a site is large enough to require donations or monetisation in another form... it can be controlled, you can't hide it, push it underground, throw it behind Tor... and money in the modern world is very trackable. When things are small, and when they're hosted all over the place... very very hard to apply regulation to them.
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
I'm not a troll, but I've said my piece and I don't wish to outstay my welcome. I'll move on. I enjoyed the opportunity to write about my passion. Thank you. And thank you for running an independent forum for 16 years. So long, and thanks for all the fish.
Truly hard to tell, as I'm sure you'll appreciate: no username, no cited links to support what you say, an anonymous / masked email... and some of your phrasing reads more as ad-hominem attack, or at least felt that way to me.
I respect your opinion too, and some of your points are valid, but your own personal circumstances shape your risk assessment as well, and that too is very valid... of course individuals run these services, and individuals all have very different personal circumstances and risk thresholds that change over time.
I might argue my risk thresholds have remained static, but that is likely a fallacy. When I was younger, fresh off the streets (rough sleeper) and hacking together forums, I had nothing to lose and everything (a chosen family) to gain... I could take all risks and laugh them off, and hadn't yet experienced things that would give me reason to pause or that would suggest I was naive. Now I am older, I have far more to lose and far less to gain, and having run so many forums I feel like I've seen it all online, and experienced a lot personally... my risk threshold has changed.
To me though, the OSA feels less like a straw on the camel's back... the GDPR felt like a straw, as being so privacy conscious and storing so little made it trivial to comply with... but this feels more like a bale of straws. You likely (obviously?) disagree; that is your opinion and is shaped by your risk assessment, but for me it is a bale of straws, not a minor thing but a major one... and from that perspective my decision is rational and reasonable, as are similar decisions taken by other forum admins.
I still love the tech side, still love the people side, and am happy in life, but this for me is a lot that I would not and do not choose to sign up to... if taking the baton falls within the risk threshold of others, great... good for them... but I've asked for volunteers in the past and none materialised, and in that light, with a clock now ticking... shuttering gives the greatest notice period possible for finding another place... jumping in the Discord life raft... or maybe, just maybe, a team figuring out how to organise and take over.
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
Would've helped if Boing Boing linked to https://www.lfgss.com/conversations/401475/
-
It's possible to hide the backend, and have visible public access that doesn't require hiding.
The law doesn't apply to proxies and ISPs, only to the platforms themselves and those who run the platforms.
One could easily envision multiple domain names all being proxies to a backend; those proxies wouldn't have to be hidden, and they wouldn't know where the actual platform was... they could proxy by listening rather than forwarding, and then "mere conduit" applies again.
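As a sketch of that idea under obvious assumptions (Go, a placeholder backend address, plain forwarding only; the "listening rather than forwarding" variant would need a different, pull-based design), each public domain could run something as small as this:

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

// A minimal public-facing proxy: every domain pointed at this host serves the
// same content, forwarded to a backend whose location only the proxy knows.
// The backend URL is a placeholder, and TLS termination is left to whatever
// sits in front of this process.
func main() {
	backend, err := url.Parse("https://backend.example.internal") // hypothetical backend
	if err != nil {
		log.Fatal(err)
	}

	proxy := httputil.NewSingleHostReverseProxy(backend)

	log.Fatal(http.ListenAndServe(":8080", proxy))
}
```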