-
Risk should be minimal if the forum assembles a team of moderators who just kill everything out of line with fire
I thought about this just this morning.
If some programmers have time you could build wide-scale community moderation.
i.e.
- If you've been on the forum longer than n time and have posted more than x times... you're on the moderation team
- Anything reported creates a thread in a hidden forum, and a notification to all moderators something needs acting on
- No single moderator can do anything; a consensus would have to emerge. This protects against a bad moderator, and also means in the messy cases the majority opinion wins... this is a "vote for ban", "vote for delete", "vote for shadowban" type thing... whichever gets the most votes on some scale would be auto-applied
This kind of thing would mean you don't have to have one or two, or even 10, people be a moderator... you'd have hundreds immediately.
The tuning of "default actions after x time" would be something the site admin does to reduce the risk to Directors... but otherwise you could groupthink the moderation and diffuse it widely.
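The consensus mechanism sketched above could look something like this. A minimal sketch only: the quorum size, action names, and `resolve_report` function are all hypothetical, not from any real implementation.

```python
from collections import Counter

# Sketch of "no single moderator acts alone": votes accumulate on a
# report, and once a quorum of distinct moderators has voted, the
# leading action is auto-applied. All names/thresholds are illustrative.
QUORUM = 5  # minimum number of distinct moderators before acting

def resolve_report(votes):
    """votes: list of (moderator_id, action) pairs, e.g. ("alice", "ban").

    Returns the winning action once quorum is met, else None (keep
    waiting). Each moderator's latest vote counts once, so a moderator
    can change their mind before the report resolves.
    """
    latest = {}
    for moderator, action in votes:
        latest[moderator] = action  # later vote overrides earlier one
    if len(latest) < QUORUM:
        return None
    tally = Counter(latest.values())
    action, _count = tally.most_common(1)[0]
    return action
```

With four moderators voting nothing happens; a fifth vote tips it over quorum and whichever action leads (here "delete") is applied automatically.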
-
LFGSS and Microcosm shutting down 16th March 2025 (the day before the Online Safety Act is enforced)
My thinking there was to put people on others block lists if they are being annoying, so they’re allowed to say legal things, but no logged in user can see them
We have global ignores, shadow bans, individually controllable ignores, etc.
If we applied them to everyone who is annoying, it is the whole site... it's human to have a bad day.
-
A real-world example some will recall from a decade or so ago:
- A person was active on the forum, built bikes, took photos, seemed to be an entirely normal user
- A few of that person's posts were a little hard to believe, but that's something that can be said about virtually everyone's posts
- We learn that they were involved in a road traffic incident, and later that they died
- Their girlfriend registered and shared this news and starts fund-raising for funeral costs
- A couple of people get suspicious because of similarity in language used, nothing more, and I investigate
- I find that the "girlfriend" is the person in question via some internet sleuthing, and track down their address, mother's address (the person is a young adult, late teens but still an adult), where they're working (thank you LinkedIn)... they had "faked" their death, and were now fundraising for their own funeral
- I share this news on the forum, and the person is shamed into apologising
- A bit of a bundle-on ensues, because people are angry, the person leaves the site
The situation ended fine, they left the site, life goes on.
Let's consider which parts of the OSA that breaks...
- Fraud? Yes, attempted, only discovered by language analysis by some other people on here
- Vulnerable person? Yes, that person was clearly struggling with their own mental health issues and at risk
- Stalking? Probably, a few individuals took it on themselves to follow everything, make themselves known
- Bullying? Yes, the reaction of the community to this was not mild / polite
- Harassment? Probably, because of the community reaction, especially a few individuals who were quite angry at having been duped
Now consider what might have happened, if that person, in their vulnerable state, and under the pressure of shame from the response to their actions, had then actually ended their life as they claimed that they had... a thing that was threatened by them at the time.
We were lucky... extremely lucky.
The actions of many, from a third party's reasonable viewpoint, with each interaction taken in isolation without the larger context... could very easily have been read as having breached the OSA to a severe degree and in multiple ways.
We were unlucky that person attached to this site, we were lucky to have it detected (and none of the OSA tools would've), and we were lucky it ended fine. If in fact that person had ended their life, I'd be carrying that still... I don't need the OSA risk to be there to try and do the right thing, but likewise... none of what the OSA covers will protect against the bad luck of how a scenario plays out.
-
They are very proactive on moderation
A few people have responded here and elsewhere that this is somehow a protective thing... it's a form of "If you've done nothing wrong, you've nothing to hide"... we're very proactive on moderation, I swing the ban hammer freely and nuke everything someone says when I do so, and I also follow up all reports, and this software keeps every version of what someone has said so I can see when they've done shit and attempted to cover their tracks.
I don't buy it though, the Act has parts that talk about the duration of time that "harmful" (not illegal) content is visible, and that harm is effectively in the eye of the beholder.
With all best intents, if you don't have 24/7 moderation coverage and surety that, when you go on vacation, someone else is covering it... this is the risk of weaponisation.
Those who have run forums for a long time have seen "fun" forum invasions, or the guy you banned registering 10 accounts and spouting off all over the place, or the "hilarity" of someone writing a crappy bit of JavaScript that does something nefarious... it's just a question of time, even when it hasn't happened recently.
My fear, which I think is reasonable, is that the law actively encourages retribution by those who are moderated / banned or just trolls... because where their act of retribution was always futile in the past, it now carries real consequences for those who are named as the "officers" for the site (typically the individuals running it).
In that world, it basically incentivises the aggrieved to find their moment and act.
These asshats that we ban are apparently more creative than I, but I can think of many ways to weaponise the OSA based on what I've seen in the past.
The idea that "no politics" and "proactive moderation" and "we are fine, we are superb at running sites" is going to save site operators feels to me like hubris. A lot of sites will never have action taken against them because of luck, pure luck, but a small minority will because of bad luck: they just had the one angry troll, the one really vulnerable person, the couple of people who bully another, the few that share effed up material via DMs. Hubris is not going to help you if you are the unlucky site where someone does commit suicide due to the behaviour of others, or some act of harm happens due to some subtle act of bullying that didn't break any number of well-intentioned strict rules. This is out of our control.
A long time ago I ran sites in a zero tolerance way, you couldn't even tell someone to fuck off... and all that happened is the bullying, harassing, all the bad stuff the OSA targets... moves to plausibly deniable territory, soft words all deniable, by a cohort of people who are friends and effectively dislike someone else.
I've found the current style works better... I let you all swear at each other, it actually provides far better signal to me as a moderator and site admin, that I get to see it and then act on it.
In my experience, the more proactive the moderation, the more strict the rules, the more it just exists but all flies below the radar level of your rules... the outcomes remain the same. That sites haven't experienced the bad outcomes so far is little to do with their moderation, and more to do with luck.
-
It occurs to me this morning that even assuming the community is held together via the life raft of Discord, that there are reasons to save this technology and to keep using it.
Namely:
- This website does not track you, snitch on you, leak your data.
- It is privacy conscious, to the extreme of scrubbing outbound links to prevent tracking.
- It is privacy conscious, to the extreme of only embedding things (like YouTube) in a no cookie / non-tracking way.
- It has the best UX/UI of any forum or community website, in large part due to simplicity, font choice, size of fonts... and just keeping things so simple and plain that the web browser you choose gives you control over how it appears.
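The link-scrubbing and cookie-free embedding from the list above could be sketched roughly like this. The tracking-parameter list is a common-but-not-exhaustive assumption, not taken from the site's actual code; `youtube-nocookie.com` is the real cookie-free embed domain Google provides.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Common tracking query parameters; illustrative, not an official list.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "fbclid", "gclid"}

def scrub_outbound(url):
    """Strip common tracking query parameters from an outbound link."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

def nocookie_embed(video_id):
    """Embed a YouTube video via the cookie-free domain."""
    return f"https://www.youtube-nocookie.com/embed/{video_id}"
```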
So... even assuming the community stays together, even looking at Discord as the best of the bunch... this technology is pretty good.
This post and thread then... a recipe for how to do this the legal way.
For this recipe, you will need (at least):
- 1 x Company Director, establish a CIC, and this person takes the standard responsibilities of a company director, as well as being the "officer" for the Online Safety Act... you take all the liability and risk... it's your job to make sure papers are filed, and truly your job is to reduce your risk.
- 1 x Secretary / Treasurer, you need a place to receive donations, reimburse bills, which is likely Open Collective, and you need to ensure that the company paperwork is in order including AGM and meeting notes, records of decisions, etc.
- 1 x Tech person, this is essentially the person who reboots servers, manages the infra, and tweaks code if needed
- 2 x Moderators, you need two to ensure you can respond in a timely way when the other is not available, you may need three or more
That is the list of critical roles that you absolutely must fill.
Additionally, nice-to-haves:
- 1 x Programmer, because you will need to build some new moderation tools and no-one else is coming to do it... if there can be 3-4 of these people you stand a better chance at getting it done.
That's what I've been doing, all of that... though I gave up on running a CIC when the donations were so low that it was just an overhead expense and the liability didn't scare me as the risk was so low... the risk is not low now, and the liability greater... so yes you'd want a CIC.
I'd still be tempted, if I were you, to move the hosting to Germany, as it's slightly cheaper, and because the object storage (attachments, files, etc) is located there it would actually make the site more performant.
I'd also still be tempted, if I were you, to complete the work I started on rewriting the frontend in Go... it would reduce your operational costs further, and make it really trivial to deploy the site anywhere and to update it in future... and more to the point, you're going to need to build moderation tools and the old Django is going to make this very hard, but having all the code in Go will make it significantly easier.
Then you finally need to access the braintank of legal support on here (it's pro-bono, forumengers who are very talented and qualified, but no-one can give indemnity or underwrite legal advice given pro-bono) to help with the OSA risk assessment, compliance docs.
After the risk assessment is done, it will tell you what you need to do to mitigate risk. We are likely "Medium risk" and "Multi-risk", all forums are, so we will be obliged to implement technical solutions to social problems, such as scanning of DMs, URLs, and file uploads, providing a better reporting button, and things like "hide the post automatically if not moderated within n hours". Those tools also need defences: don't underestimate the likelihood of an automated "report every comment" attack triggering the hiding of all comments on the whole site, so you need to detect bad actors in the moderation process too. You probably also need more queries / reports to find stalkers / harassers, i.e. "find the top people who replied to person A", so that if person A reports being stalked / harassed online you can quickly verify with data whether that appears to be the case. Mitigating risk requires some tech work.
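A "top people who replied to person A" report like the one just described is a single aggregation query. This sketch assumes a hypothetical `comments` table with an `in_reply_to_author` column; the real Microcosm schema will differ.

```python
import sqlite3

# Hypothetical schema: comments(id, author, in_reply_to_author).
# Illustrates the moderation query only, not the real data model.
def top_repliers(conn, target, limit=5):
    """Who replies to `target` most often? A first-pass stalking /
    harassment signal for a human moderator to review, not a verdict."""
    rows = conn.execute(
        """SELECT author, COUNT(*) AS n
           FROM comments
           WHERE in_reply_to_author = ?
           GROUP BY author
           ORDER BY n DESC
           LIMIT ?""",
        (target, limit),
    )
    return list(rows)
```

The point of the limit and ordering is that the moderator sees a short ranked list, not raw data, and can compare it against the report they received.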
This is the legal way :) It can be done, in this scenario I'd help as an advisor only, may touch code and servers when needed, but would no longer do the Director, Treasurer, Moderator roles... and only at the very most I'd be a faceless techie to help in the worst incidents based on my experience, nothing more, likely less.
-
Be clear what this involves, this is one of two roles that is critical.
It is this role that takes the liability and manages the risk, this role that makes your name known, that can be correlated with LinkedIn, etc.
I'd love to see this work and all the roles filled, but I believe it only works long term if the people stepping forward do so eyes wide open.
As an officer of the company you get the liability of the Online Safety Act, and you get the responsibilities of a company director... meaning clean-up and closing the company are yours, filing accounts ultimately is yours, etc.
The tasks are simple, but it is responsibility.
If people step forward eyes wide open then it is a great thing and offers the possibility of this working out.
The other key role is the secretary/treasurer, which I assume will be the name on the Open Collective for fundraising, and who will pay the bills and ensure the company has meetings and that notes are taken. This role also requires a name and address to be known and visible in places due to anti-money-laundering laws.
-
-
sorry if I'm a bit slow, but what thread?
Pinned to the front page, but https://www.lfgss.com/conversations/401475/
-
Discord is no good as a replacement for this place. To the extent that it works at all, it's for ephemeral chat while people find other people to join a session in a multiplayer game.
Whilst I don't disagree, I have the server stats and cache hit stats that show this place was nearly always "stuff posted in the last week", and that very seldom did any person look at stuff older than that. In fact, it happened so seldom that my LRU caches would be empty of that content when a crawler came around, which is why the crawlers impacted us so much. The caches are big, the content just expired out.
Ephemeral chat is the bulk of what people do... though of course it's amazing to search for something and pull up an old thread that answers it perfectly, or to cite a joke from a decade ago, etc... it just wasn't done that often.
I'm sure you'll anecdotally say you do it constantly... that's great, but the vast majority didn't.
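The cache-expiry effect described above is easy to reproduce with a tiny LRU: the hot "last week" window fills the cache, so when a crawler walks the archives every request misses and evicts hot content as it goes. Sizes here are purely illustrative, not the site's real cache configuration.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache; illustrative sizes only."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, key, load):
        if key in self.data:
            self.data.move_to_end(key)  # refresh recency on a hit
            self.hits += 1
            return self.data[key]
        self.misses += 1
        value = load(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
        return value

cache = LRUCache(capacity=100)
# Regular readers hammer the last week's 100 threads: warm, mostly hits...
for _ in range(10):
    for thread in range(100):
        cache.get(("thread", thread), lambda k: "rendered page")
# ...then a crawler walks 1000 old threads: every single one misses,
# and each miss pushes a hot thread out of the cache.
for thread in range(100, 1100):
    cache.get(("thread", thread), lambda k: "rendered page")
```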
-
where an online community was used to bring perpetrators together
Illegal stuff will always happen, it just goes underground. Bad stuff will still happen when the OSA is being enforced.
The OSA does not stop bad stuff happening... it just causes a chilling effect where non-compliance of the good stuff triggers shutdowns, and consolidates everything into the hands of big tech.
Know that this is what Ofcom and the politicians desire, as they believe it makes communications easier to regulate and control.
The bit of me that thinks "fuck it, put it in another country and make it an anarchist collective and a maze of volunteers in multiple countries"... is the bit of me that is angry that this is happening.
But there's a lot of realisation too that this is Cnut the Great sitting on a throne on a beach and trying to turn back the tide... I have a happy and busy life, letting go is the best thing to do. Let Discord figure out the compliance.
-
-
Where my head is at, if there's no equivalent to a "this only applies to entities with global revenue above £25M" or something like that... and there remains liability for me as described in the OSA... I'm out, I'm done.
I think personally that the best thing to do is preserve this place, and make the life raft (Discord) work.
I simply don't think this is enough of a vote winning issue that politicians care, and the media have so long run stories on the theme of "Won't somebody think of the children"... that the chances of a carve-out or U-turn are vanishingly small.
Additionally... I find thinking of the scenario of a U-turn damaging still... a lot of the damage has been done, we know we need to shutter, people are already trying to make other places work... if a reprieve came in the 11th hour, what would survive in the many communities facing this will be a shadow of their former selves.
I can say "Yes, I'd carry on"... but it feels like that weird conversation, "Would you still love me if I was a worm?"... it's never going to happen, so I can say yes, but that reply is glib as that reality doesn't exist.
-
The fallout of this bill and what it is doing to sites like this one is going to be a death knell for a lot of people. Chronic loneliness and lack of community is such a massive trigger for suicide in men, rates of which are disgustingly high already given the state of our mental health services and the lack of support for young people to get in front of any issues. Taking away what is likely their only lifeline to be open or discuss something they share an interest in is going to fucking suck, hard. I focus on men here as it is very likely that a lot of these fora are strongly weighted that way, not that only men will be affected.
Thank you for writing this... this is why I'd continued all this time, it worked like this for me as well. But thank you, for writing this.
-
Will the site be archived and viewable in its final form? Or will it be "the true death"?
True death... running the site read-only would have ongoing costs, and I cannot expect those to be covered by future donations.
I'm open to someone crawling the publicly visible site in its totality and making a workable archive that looks like this.
You should know that the Archive Team https://wiki.archiveteam.org/ are already crawling us, and we are on their Deathwatch https://wiki.archiveteam.org/index.php/Deathwatch#2025 and that I've said I will add them to the allow list and ensure they can fully archive us. I don't know today how that will be accessed.
I am fine saying to the people on here that if you want to run a recursive wget, feel free to do so.
You should archive everything that hits https://www.lfgss.com and all files / attachments that come from https://lfgss.microcosm.app/ . The final archive will be somewhere between 1-2TB.
Actually, scratch this... I'll do it and will make a torrent available.
-
Honestly, I spoke to a political editor at a major publication yesterday, and she essentially distilled this down to being the end of "safe harbour" and "mere conduit"; she believes my interpretation is sound.
She was supportive of providing contacts and intros, or running a story, but what struck me was her own conclusion that she would not wish to be a small site operator in this environment.
I'll continue to fight it, in part because if I succeed in any small way it will help all small sites, but for me personally I am committed to not being involved after 16th March.
I believe too deeply that the risk is not low and the liability is high, combine that with transitioning and I am too visible and too much a target for this stuff being weaponised by trolls/TERFs and others.
I still personally believe that the Discord life raft is probably the wisest thing to put effort into, but to enjoy this and other sites for as long as we can.
A few people here never buggered off 🤣