Once again we see federation doesn't solve our problems. That shouldn't surprise anyone because we've had federation since the earliest days of the internet with usenet, irc, and email, yet the core tech of the early internet succumbed to the same problems. I'd go so far as to say federation and moderation (or more appropriately, self-governance systems) are separate problem domains without a lot of overlap.
Let's take a look at a fourth solution that the author hasn't thought of yet - but we have.
Get all of your userbase in on the moderation, helping out in whatever places they frequent. Make sure the power comes from democratic action in-aggregate, instead of from a small group of powerful accounts. This keeps it more honest and reliable because something has to get a rise out of a lot more people to get them to tag/report enough that it becomes a real issue.
Now let's have those reports reviewed by the mod-tier users, hopefully several thousand of them per large group. If a batch of reports coming in for the same incident turns out to be bullshit abuse (as it clearly was in this instance imo), then the accounts who abused the report feature have their future reports penalized, perhaps even losing access to the report system itself. It might be wise to cap the number of reports any single user can file as well, since having fewer reports to hand out would probably make people use them more judiciously and responsibly. I think just letting everyone spam reports non-stop is a very bad idea - we've seen how that works on reddit.
Conversely, when reports are used well and validated by the trusted group's users, those doing the reporting have their reports prioritized or rewarded with greater access and greater trust. This should certainly feed into their rep with regard to becoming a future moderator-level user in those groups where they do good reporting.
New users are prevented from using this system for some time period, likely several months, so they have time to learn the community norms before being asked to enforce them.
People can also opt-out of all 'governance' activity in their profiles and never be bothered with moderation features. I wonder, would it be better to opt-in everyone by default, or opt-out everyone by default, and leave the governance only to the people who ask for it? Those two choices will create two very different groups of moderators, and I wonder which one is better...
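Pulling the ideas above together - trust-weighted reports, a per-user report cap, a probation period for new accounts, and a profile-level opt-out - a rough sketch might look like this. All the names and numbers here are invented for illustration, not anything Tildes actually implements:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

MIN_ACCOUNT_AGE = timedelta(days=90)   # assumed "several months" probation
BASE_REPORTS_PER_DAY = 5               # assumed daily report cap at trust 1.0

@dataclass
class Reporter:
    joined: datetime
    trust: float = 1.0          # weight applied to this user's reports
    opted_out: bool = False     # profile-level opt-out of governance features
    reports_today: int = 0

    def may_report(self, now: datetime) -> bool:
        """Opted-out users, new accounts, and over-budget users can't report."""
        if self.opted_out:
            return False
        if now - self.joined < MIN_ACCOUNT_AGE:
            return False
        # higher trust earns a bigger daily report budget
        budget = int(BASE_REPORTS_PER_DAY * self.trust)
        return self.reports_today < budget

    def review_outcome(self, upheld: bool) -> None:
        """Trusted reviewers validated (or rejected) one of this user's reports."""
        if upheld:
            self.trust = min(2.0, self.trust + 0.1)   # good reports build trust
        else:
            self.trust = max(0.0, self.trust - 0.25)  # abuse is punished harder
```

The asymmetry (small reward, larger penalty) is deliberate: it mirrors the "trust users but punish abusers" principle, and a trust of 0 silently drops a user's report budget to zero without needing a separate ban flag.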
IMO the key to that whole system is "Trust users but punish abusers" though. Even "in aggregate", with actions that are "democratic" and "verified by trusted users", the systems can still be subverted to commit unethical/unfair acts like targeted harassment, or simply fall prey to tyranny of the majority. Without some form of trustworthy central authority to oversee the system and punish those who abuse it, it can quickly fall apart. This is one thing that federation renders pretty much impossible too, which is why I think rather than solving the issues of the current social media systems (like witch hunts, doxxing and targeted harassment), it will actually make them much, much worse... as we basically just saw happen to Wil Wheaton, who had no real choice but to flee his instance and ultimately chose to abandon Mastodon entirely.
If it's done in the open with auditing, any unethical/unfair mob is going to be exposed and their actions reversed, without the need for a central authority - which is critical, because no central authority can handle the workload of vetting everything. If it isn't widely crowdsourced and still relies on paid staff or artificial-stupid algorithms, it's doomed in the long run. It has to be self-sustaining.
There is a role for a central authority - setting policy for what is and is not acceptable behavior. The central authority can also step in when the trusted group fails at moderating. That's where they are needed - they watch the watchmen, fix things when the system hits an edge case and breaks down. It's a much smaller number of people to oversee (modding the mods) and that might be doable with a paid staff, as long as there's good two-way communication all the time.
As for the tyranny of the majority, that problem isn't getting solved until someone invents something better than democracy. Limiting the number of reports from a single user, preventing the majority from seeing actions in-progress so they don't bandwagon, and making examples of hostile groups of users in very public fashion should all help reduce it. So will limiting the 'majority' who can act to that group's own regular users - everyone else is in drive-by mode.
The users have to handle it all, the vast majority of the time. That's the only way to get enough man-hours into the system to get the work done. I like the mindset where anyone reading a thread is responsible for moderating it by the simple virtue of being there.
We'll get to take another shot at that once we turn comment tagging back on here. That'll get us started. Should be fun! :D
any unethical/unfair mob is going to be exposed and their actions reversed, without the need for a central authority
Reversed by who, if not a central authority? Even with a significant trusted user pool there is still the distinct possibility that the majority of them will agree with the unethical behavior, especially so in smaller niche groups... which is what I meant by tyranny of the majority. All open auditing allows is for that unethical behavior to be discovered, but you still need someone in authority to punish them and reverse their actions. Not everything can be automated or democratized, nor should it be... punishment especially, since that's where mob rule comes into effect and sensibility/objectivity often goes out the window.
Reversed by the group's trusted users. If there's a mod team of thousands, and a clique of them start some shit that's in violation, the rest will notice and/or be notified by the regular users, and are free to act on it to police their own mod team. In the rare event that most of those people all lose their shit at the same time over the same issue, that's when the admins can step in.
So... no matter how decentralised and crowd-sourced the moderation process is, there still needs to be someone (or some ones) at the top of the pyramid with ultimate power. You've said it explicitly:
There is a role for a central authority - setting policy for what is and is not acceptable behavior. The central authority can also step in when the trusted group fails at moderating.
You're effectively agreeing with @cfabbro. You might differ about the details, but you both agree with the concept that there has to be a central authority at the top of the pyramid reviewing everything (that authority might be a single person, or a small group of people, but it still exists as a central authority). There'll probably be a few layers of moderator-types in this model, from tag-editors up to ban-masters, but every moderation action is being watched by everyone, and each level of moderators is being overseen by the level of moderators above it, which is in turn being overseen by the level of moderators above it, and so on. And, at the top of the pyramid sits Deimos.
Acolytes are overseen by wizards, who are in turn overseen by sorcerers, who are in turn overseen by mages, who are in turn overseen by archmages, who are in turn overseen by demigods, who are in turn overseen by lesser gods, who are in turn overseen by The God of Terror Himself.
Yep. I think where we disagree is how often the top levels get involved. Every step up that tree should be accompanied by a much greater distance between incidents. In your example, wizards doing 75% of the work and the god of terror getting involved like once a year sounds about right to me.
We have a bad tendency to dismiss issues by just pushing them up the chain and saying 'let the mods handle it' which isn't solving anything, it's just sweeping the problems under the rug. Push all the work up the chain and the people at the top end up unable to cope.
Most of the people and man hours are towards the bottom of the pyramid, so most of the moderation work has to get done at the bottom. I'll be pushing for that with every feature we come up with. The closer to the bottom and the more people who can get involved in any given system, the better that system is.
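That "wizards doing 75% of the work" idea can be made concrete: if each tier resolves most of what reaches it, the volume that leaks through to the next tier up shrinks geometrically, which is exactly why the top of the pyramid only gets involved rarely. A toy sketch - tier names are from this thread, and the resolve rates are invented for illustration:

```python
# Hypothetical escalation model: each tier resolves a fixed fraction of what
# reaches it, and only the unresolved remainder escalates to the next tier.
TIERS = ["users", "wizards", "mages", "demigods", "god_of_terror"]
RESOLVE_RATE = {"users": 0.90, "wizards": 0.75, "mages": 0.75,
                "demigods": 0.75, "god_of_terror": 1.0}

def escalation_load(incidents: float) -> dict[str, float]:
    """Return how many incidents each tier actually has to handle."""
    load = {}
    remaining = incidents
    for tier in TIERS:
        handled = remaining * RESOLVE_RATE[tier]
        load[tier] = handled
        remaining -= handled
    return load
```

With numbers like these, out of 100,000 incidents the regular users absorb 90,000, the wizards a few thousand, and only a few hundred ever reach the demigods - the bottom-heavy distribution the comment above is arguing for.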
Yep. I think where we disagree is how often the top levels get involved. Every step up that tree should be accompanied by a much greater distance between incidents. In your example, wizards doing 75% of the work and the god of terror getting involved like once a year sounds about right to me.
I tend to agree. I also think the wizards and the demigods are going to be different types of work. I don't imagine demigods will spend a lot of time moving topics to appropriate groups, for instance. They'll be more focussed on what groups and sub-groups should exist, so the wizards can move topics accordingly.
The closer to the bottom and the more people who can get involved in any given system, the better that system is.
However, there is still the possibility of a tyranny of the majority. In our case, we hope the majority will be good wizards and mages and demigods practising white magicks, in which case we'll want them to have control and exert their tyranny over their domains. If too many of them start dabbling in black magicks, that will change things - especially because current wizards can become future mages, taking their dark arts further up the chain. This is why the gods and demigods have to be ever-alert for shenanigans among the lesser powers. They probably can't get away with being involved only once a year - more like once or twice a month.
Having a backroom will help. That's how most mod teams handle things now. Something crops up, mods have a back channel (discord, slack, irc, private subs etc) where they can all talk it over and come to a consensus, then act with a unified front. Build that right into Tildes so it comes as a part of every group, like a ~groupname.gov that trusted users get in on early. Mods could even have their discussion there, then publish it up into the main group so everyone can see the decision-making process.
That way when the majority is making a stupid decision, the voices of reason in their midst have a chance to convince them otherwise before they commit to something without thinking it through.
We're also gonna need good guides on how to moderate. It'll work better if people are informed and have a process to fall back on when they get their mod bits.
That way when the majority is making a stupid decision, the voices of reason in their midst have a chance to convince them otherwise
This is where I think your idealism won't survive reality. If a majority of users/super-users/moderators want to do something, then a minority speaking up will probably not change their minds.
To draw on my own recent experiences moderating Star Trek subreddits, I could imagine a majority of people in ~tv.startrek deciding to report and eliminate any praise of 'Star Trek: Discovery' because it's not real Star Trek. Every time someone posts a comment praising 'Discovery', they'll get brigaded by other users tagging their comments as "noise" or "troll", and the moderators won't do anything to stop it because they agree. The minority of Trek fans who happen to like 'Discovery' can speak up all they want, but they're never going to change that behaviour.
This issue won't be resolved in a genteel backroom discussion. This will need someone from higher up to step in. A demigod will have to act and pull the wizards and mages into line.
And that's just a science fiction television show! I imagine it would get worse if something important was involved, like U.S. politics.
Really, in that scenario, there isn't a single mod who can speak up and convince the rest to find a better way to deal with it than deleting every mention - even by pointing out to them that it's against site policy and that they risk losing their mod powers if they abuse them in that manner? How many mods are there in that situation - five? twenty? What happens when it's 500?
On reddit, nobody's going to school the mod team for being dickish - they can get away with it. If Tildes makes a few high profile examples out of rogue mod teams that pull this sort of crap it'll set a precedent. I'd also be willing to bet that a single backroom post from an admin is all it'll take to set most teams back on the right track again.
You're effectively agreeing with @cfabbro. You might differ about the details, but you both agree with the concept that there has to be a central authority at the top of the pyramid reviewing everything (that authority might be a single person, or a small group of people, but it still exists as a central authority).
Lol, yeah as with most “debates” @Amarok and I have gotten into over the last year, we do largely agree... we just have slightly different perspectives so tend to draw the lines at different angles. E.g. he is optimistic about democratizing all the systems, whereas I am exceptionally wary of its potential shortcomings. ;)
It's good to have someone to debate the finer points with.
I think it may come down to getting people to put some faith in the systems. On a place like reddit, there's the sense that no matter what you do it's never going to be enough, and no one in positions of power actually cares about improving things. That makes it pretty easy to become pessimistic and walk away. Reddit's culture breeds mod apathy.
If we can give people the sense that here, it will matter and people will continue to build better systems, maybe we can pull away from the death spiral of negativity. Attitude really is everything. I love the attitude of this team. Everyone here just knows we can do this. :D
That's fair. The only current comparison we can really make is reddit... and we all know the top-level administration there leaves much to be desired, which puts a lot more pressure on mods there since they have so little support, as well as leaves room for others to misbehave without being punished. So all we can really do is guess as to how things will play out differently here.
He is optimistic about democratization of all the systems
Yeah, I've noticed that. :)
Me, I'm pessimistically optimistic. I know that bad people can still misuse good tools. Ultimately, I'm relying on us having good people to ensure that doesn't happen. Tildes' culture starts from the top down, with Deimos.
There was also a thread about it on /r/OutOfTheLoop, but that didn't add too much.
As I understand it, it seems the controversy started on Twitter (?) when he advertised and endorsed a block list that happened to include a lot of trans people. The block list originally grew out of a well-managed list of spammers and flame bots, but the owner had since taken exclusive control of it. She apparently had a problem with trans people and a habit of blocking people on a whim or out of spite, and so a lot of innocent (as in not spammers or flame bots) trans people ended up on a widely used block list and shut out of much of the gaming industry's presence on Twitter.
Wil didn't know this when he advertised it, and his endorsement helped spread its use into the TV and film industry, further locking out these trans people on Twitter.
When the damage was revealed, he made a public effort to clear his own block list of these trans people, but not to encourage others to do the same nor to publicize tools that made it easier to do so. This ended with a whole lot of trans people permanently blocked out of large areas of Twitter due to one woman's pettiness and a whole lot of other people's complacency. Needless to say, this made trans people unhappy.
Then we move to Mastodon. Apparently, one of his close friends recently abused his girlfriend (the friend's, not Wil's - Wil is married). Wil has been reluctant to call him out on this or support the women leaving his (the friend's) projects over it. A Mastodon user asked him for comment on this (after pulling a petty "bofa deez nuts" joke on him?), and Wil got mad and blocked her. This user happened to be trans, which added fuel to the fire.
From there, things devolved as one might expect on the Internet, and Wil got fed up, wrote the blog post referenced in the OP, and left social media entirely.
Well, let's send him a Tildes invite then. I think the point where people nope out of social media in frustration is the perfect time to introduce them to Tildes. :)
Hah, if you have a way to reach him, go right ahead. ;) I do wonder, though, whether it's best not to involve celebrities at the stage we're in right now. I feel like that could draw an enormous crowd that we're just not prepared to handle yet.
The overwhelming majority of invites are coming from just a couple people using them in /r/tildes threads. Everyone else just gets five when Deimos tops everyone up every couple weeks. People can PM Deimos for more if they have a larger group they want to bring in, but that doesn't happen much. That's pretty well controlled imo. We don't have to worry about 10k overnight growth spurts yet, thank god. That would definitely kill us. Assuming everyone has 5 in their profile right now that's about 30k invites waiting to be handed out.
I think having celebrity types around would provide some valuable perspective on the stalking and harassment. They've typically been exposed to much more of it over much longer periods of time than most people. They are used to being 'judged' every minute with every silly social media post. They'll have weathered more of these storms, enough to have some experience. I have to imagine people who've lived through that are going to have better insight into the problem and perhaps we can use that insight to figure out better ways to deal with those issues when they arise.
I've been wondering how much 'playing to the audience' affects online behavior. I think it's likely to be a significant factor. While Tildes is invite only we really don't have to worry about that effect - especially while it's closed to those without accounts.
Once again we see federation doesn't solve our problems. That shouldn't surprise anyone because we've had federation since the earliest days of the internet with usenet, irc, and email, yet the core tech of the early internet succumbed to the same problems. I'd go so far as to say federation and moderation (or more appropriately, self-governance systems) are separate problem domains without a lot of overlap.
Let's take a look at a fourth solution that the author hasn't thought of yet - but we have.
Get all of your userbase in on the moderation, helping out in whatever places they frequent. Make sure the power comes from democratic action in-aggregate, instead of from a small group of powerful accounts. This keeps it more honest and reliable because something has to get a rise out of a lot more people to get them to tag/report enough that it becomes a real issue.
Now let's have those reports reviewed by the mod tier users, hopefully several thousand of them per large group. If a batch of reports like that all coming in for the same incident is bullshit abuse (as it clearly was in this instance imo) then the accounts who abused that report feature have their reports penalized in the future, perhaps even losing access to the report system itself. It might be wise to restrict the maximum reporting capability of any single user as well, since having fewer reports to hand out would probably make people use them more judiciously and responsibly. I think just letting everyone spam report non-stop is a very bad idea - we've seen how that works on reddit.
Conversely, when reports are used well and validated by the trusted group's users, those doing the reporting have their reports prioritized or rewarded with greater access and greater trust. This should certainly feed into their rep with regard to becoming a future moderator-level user in those groups where they do good reporting.
New users are prevented from using this system for some time period, likely several months, so they have time to learn the community norms before being asked to enforce them.
People can also opt-out of all 'governance' activity in their profiles and never be bothered with moderation features. I wonder, would it be better to opt-in everyone by default, or opt-out everyone by default, and leave the governance only to the people who ask for it? Those two choices will create two very different groups of moderators, and I wonder which one is better...
IMO the key to that whole system is "Trust users but punish abusers" though. Even "in aggregate" and having actions be "democratic" and "verified by trusted users", the systems can still be subverted to commit unethical/unfair acts like targeted harassment or even just simply fall prey to tyranny of the majority. Without some form of trustworthy central authority to oversee the system and punish those who abuse it, it can quickly fall apart. This is one thing that Federation renders pretty much impossible too, which is why I think rather than solving the issues of the current social media systems (like witch hunts, doxxing and targeted harassment), it will actually make them much, much worse... as we basically just saw happen to Wil Wheaton who had no real choice but to flee the instance and ultimately chose to abandon mastodon entirely.
If it's done in the open with auditing, any unethical/unfair mob is going to be exposed and their actions reversed, without the need for a central authority - which is critical, because no central authority can handle the workload of vetting everything. If it isn't widely crowdsourced and still relies on paid staff or artificial-stupid algorithms, it's doomed in the long run. It has to be self-sustaining.
There is a role for a central authority - setting policy for what is and is not acceptable behavior. The central authority can also step in when the trusted group fails at moderating. That's where they are needed - they watch the watchmen, fix things when the system hits an edge case and breaks down. It's a much smaller number of people to oversee (modding the mods) and that might be doable with a paid staff, as long as it's good two-way communication all the time.
As for the tyranny of the majority, that problem isn't getting solved until someone invents something better than democracy. Limiting the number of reports from a single user, preventing the majority from seeing actions in-progress so they don't bandwagon, and making examples of hostile groups of users in very public fashion should all help reduce it. So will limiting the 'majority' who can act to that group's own regular users - everyone else is in drive-by mode.
The users have to handle it all, the vast majority of the time. That's the only way to get enough man-hours into the system to get the work done. I like the mindset where anyone reading a thread is responsible for moderating it by the simple virtue of being there.
We'll get to take another shot at that once we turn comment tagging back on here. That'll get us started. Should be fun! :D
Reversed by who, if not a central authority? Even with a significant trusted user pool there is still the distinct possibility that the majority of them will agree with the unethical behavior, especially so in smaller niche groups... which is what I meant by tyranny of the majority. All open auditing allows for is that unethical behavior to be discovered, but you still need someone in authority to punish them and reverse their actions. Not everything can be automated or democratized, nor should it be... punishment especially since that's where mob rule comes in to effect and sensibility/objectivity often goes out the window.
Reversed by the group's trusted users. If there's a mod team of thousands, and a clique of them start some shit that's in violation, the rest will notice and/or be notified by the regular users, and are free to act on it to police their own mod team. In the rare event that most of those people all lose their shit at the same time over the same issue, that's when the admins can step in.
So... no matter how decentralised and crowd-sourced the moderation process is, there still needs to be someone (or some ones) at the top of the pyramid with ultimate power. You've said it explicitly:
You're effectively agreeing with @cfabbro. You might differ about the details, but you both agree with the concept that there has to be a central authority at the top of the pyramid reviewing everything (that authority might be a single person, or a small group of people, but it still exists as a central authority). There'll probably be a few layers of moderator-types in this model, from tag-editors up to ban-masters, but every moderation action is being watched by everyone, and each level of moderators is being overseen by the level of moderators above it, which is in turn being overseen by the level of moderators above it, and so on. And, at the top of the pyramid sits Deimos.
Acolytes are overseen by wizards, who are in turn overseen by sorcerors, who are in turn overseen by mages, who are in turn overseen by archmages, who are in turn overseen by demigods, who are in turn overseen by lesser gods, who are in turn overseen by The God of Terror Himself.
Yep. I think where we disagree is how often the top levels get involved. Every step up that tree should be accompanied by a much greater distance between incidents. In your example, wizards doing 75% of the work and the god of terror getting involved like once a year sounds about right to me.
We have a bad tendency to dismiss issues by just pushing them up the chain and saying 'let the mods handle it' which isn't solving anything, it's just sweeping the problems under the rug. Push all the work up the chain and the people at the top end up unable to cope.
Most of the people and man hours are towards the bottom of the pyramid, so most of the moderation work has to get done at the bottom. I'll be pushing for that with every feature we come up with. The closer to the bottom and the more people who can get involved in any given system, the better that system is.
I tend to agree. I also think the wizards and the demigods are going to be different types of work. I don't imagine demigods will spend a lot of time moving topics to appropriate groups, for instance. They'll be more focussed on what groups and sub-groups should exist, so the wizards can move topics accordingly.
However, there is still the possibility of a tyranny of the majority. In our case, we hope the majority will be good wizards and mages and demigods practising white magicks, in which case we'll want them to have control and exert their tyranny over their domains. If too many of them start dabbling in black magicks, that will change things - especially because current wizards can become future mages, taking their dark arts further up the chain. This is why the gods and demigods have to be ever-alert for shenanigans among the lesser powers. They might not be able to be involved only once a year (more like once or twice a month).
Having a backroom will help. That's how most mod teams handle things now. Something crops up, mods have a back channel (discord, slack, irc, private subs etc) where they can all talk it over and come to a consensus, then act with a unified front. Build that right into Tildes so it comes as a part of every group, like a ~groupname.gov that trusted users get in on early. Mods could even have their discussion then publish it up into the main group so everyone can see the decision making process.
That way when the majority is making a stupid decision, the voices of reason in their midst have a chance to convince them otherwise before they commit to something without thinking it through.
We're also gonna need good guides on how to moderate. It'll work better if people are informed and have a process to fall back on when they get their mod bits.
This is where I think your idealism won't survive reality. If a majority of users/super-users/moderators want to do something, then a minority speaking up will probably not change their minds.
To draw on my own recent experiences moderating Star Trek subreddits, I could imagine a majority of people in ~tv.startrek deciding to report and eliminate any praise of 'Star Trek: Discovery' because it's not real Star Trek. Every time someone posts a comment praising 'Discovery', they'll get brigaded by other users tagging their comments as "noise" or "troll", and the moderators won't do anything to stop it because they agree. The minority of Trek fans who happen to like 'Discovery' can speak up all they want, but they're never going to change that behaviour.
This issue won't be resolved in a genteel backroom discussion. This will need someone from higher up to step in. A demigod will have to act and pull the wizards and mages into line.
And that's just a science fiction television show! I imagine it would get worse if something important was involved, like U.S. politics.
Really, in that scenario, is there not a single mod who can speak up and convince the rest to find a better way to deal with it than deleting every mention - even by pointing out that it's against site policy and that they risk losing their mod powers if they abuse them that way? How many mods are there in that situation - five? Twenty? What happens when it's 500?
On reddit, nobody's going to school the mod team for being dickish - they can get away with it. If Tildes makes a few high profile examples out of rogue mod teams that pull this sort of crap it'll set a precedent. I'd also be willing to bet that a single backroom post from an admin is all it'll take to set most teams back on the right track again.
We already know how to properly moderate a politics discussion thanks to some folks on reddit. We'll need to build systems to help with that challenge as well.
Lol, yeah, as with most “debates” @Amarok and I have gotten into over the last year, we do largely agree... we just have slightly different perspectives, so we tend to draw the lines at different angles. E.g., he is optimistic about democratization of all the systems, whereas I am exceptionally wary of its potential shortcomings. ;)
One might say you're in violent agreement. ;)
It's good to have someone to debate the finer points with.
I think it may come down to getting people to put some faith in the systems. On a place like reddit, there's the sense that no matter what you do it's never going to be enough, and no one in positions of power actually cares about improving things. That makes it pretty easy to become pessimistic and walk away. Reddit's culture breeds mod apathy.
If we can give people the sense that here, it will matter and people will continue to build better systems, maybe we can pull away from the death spiral of negativity. Attitude really is everything. I love the attitude of this team. Everyone here just knows we can do this. :D
That's fair. The only current comparison we can really make is reddit... and we all know the top-level administration there leaves much to be desired, which puts a lot more pressure on mods there since they have so little support, and leaves room for others to misbehave without being punished. So all we can really do is guess at how things will play out differently here.
Yeah, I've noticed that. :)
Me, I'm pessimistically optimistic. I know that bad people can still misuse good tools. Ultimately, I'm relying on us having good people to ensure that doesn't happen. Tildes' culture starts from the top down, with Deimos.
I was trying to figure out what the whole controversy was about, and this was about the only rational article I could find:
https://medium.com/@AmberEnderton/wil-wheaton-has-a-listening-problem-accdf6277b88
There was also a thread about it on /r/OutOfTheLoop, but that didn't add too much.
As I understand it, it seems the controversy started on Twitter (?) when he advertised and endorsed a block list that happened to include a lot of trans people. The block list originated as a well-managed list of spammers and flame bots, but the owner had since taken exclusive control of it. She apparently had a problem with trans people and a habit of blocking people on a whim or out of spite, and so a lot of innocent (as in not spammers or flame bots) trans people ended up on a widely used block list and shut out of much of the gaming industry's presence on Twitter.
Wil didn't know this when he advertised it, and his endorsement helped spread its use into the TV and film industry, further locking out these trans people on Twitter.
When the damage was revealed, he made a public effort to clear his own block list of these trans people, but didn't encourage others to do the same or publicize tools that made it easier to do so. This left a whole lot of trans people permanently blocked out of large areas of Twitter due to one woman's pettiness and a whole lot of other people's complacency. Needless to say, this made trans people unhappy.
Then we move to Mastodon. Apparently, one of his close friends recently abused his girlfriend (the friend's, not Wil's - Wil is married). Wil has been reluctant to call him out on this or support the women leaving his (the friend's) projects over it. A Mastodon user asked him for comment on this (after pulling a petty "bofa deez nuts" joke on him?), and Wil got mad and blocked her. It happens to be that this user was trans, which added fuel to the fire.
From there, things devolved as one might expect on the Internet, and Wil got fed up, wrote the blog post referenced in the OP, and left social media entirely.
I hope that's accurate. 😞
Well, let's send him a Tildes invite then. I think the point where people nope out of social media in frustration is the perfect time to introduce them to Tildes. :)
Hah, if you have a way to reach him, go right ahead. ;) I do wonder, though, whether it's best not to involve celebrities at the stage we're in right now. I feel like that could draw an enormous crowd that we're just not prepared to handle yet.
The overwhelming majority of invites are coming from just a couple people using them in /r/tildes threads. Everyone else just gets five when Deimos tops everyone up every couple weeks. People can PM Deimos for more if they have a larger group they want to bring in, but that doesn't happen much. That's pretty well controlled imo. We don't have to worry about 10k overnight growth spurts yet, thank god. That would definitely kill us. Assuming everyone has 5 in their profile right now, that's about 30k invites waiting to be handed out.
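The arithmetic behind that 30k estimate, assuming roughly 6,000 accounts (a figure inferred from the estimate itself - the actual user count isn't given here):

```python
# Back-of-the-envelope invite capacity. The user count is an
# assumption inferred from the ~30k figure above, not a real number.
users = 6_000
invites_per_user = 5  # each profile holds five invites after a top-up
total_invites = users * invites_per_user
print(total_invites)  # 30000
```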
I think having celebrity types around would provide some valuable perspective on the stalking and harassment. They've typically been exposed to much more of it over much longer periods of time than most people. They are used to being 'judged' every minute with every silly social media post. They'll have weathered more of these storms, enough to have some experience. I have to imagine people who've lived through that are going to have better insight into the problem and perhaps we can use that insight to figure out better ways to deal with those issues when they arise.
I've been wondering how much 'playing to the audience' affects online behavior. I think it's likely to be a significant factor. While Tildes is invite only we really don't have to worry about that effect - especially while it's closed to those without accounts.