P.S. The difference between this post and this post is that this one is more about asking questions and getting people's opinions and answers.
Personally, I find this idea almost terrifying, because it implies that social media in its current form cannot be fixed by changing or expanding human moderation or auto-moderation, nor by fact-checking, because moderation can't reasonably be done at scale at all.
However, I have 2 questions:
1: If large social media platforms can't really be moderated, what should we do with them? The implied solution is balkanizing social media until the 'platforms' are extended social circles that can be moderated and host good discussion (or, more practically, migrating them to a federated service like Mastodon, which is built to be split this way, or to something like Discord). An alternative I've heard is to redo the early 2000s and have fan forums for everything, which avoids context collapse and gives each site something gluing its users together (an approach I am far more supportive of), or at least a reason for invite systems and stricter control over who enters your site. But that doesn't address the claim that once your site hits a certain user count, it will inevitably worsen, and that this stems from human nature (Dunbar's number, i.e. the cognitive limit on how many stable social relationships a person can maintain) and so is inevitable, almost natural.
2: Why is moderation impossible to do well at large scales? I think moderation is analogous to law enforcement or legal systems (though the many reddit mods here can definitely give their opinions on that), and it likely isn't the kind of thing that can be done at a profit, but I'm not entirely sure why it would be wholly impossible. One reason I've heard is that moderators need to understand the communities they're moderating, but I'm not sure why that couldn't simply be a requirement, or why adding more mods would make things worse (mods disagreeing with each other while moderating seems quite likely, but unrelated to this).