21 votes

[Suggestion] Use sortition for moderation?

In governance, sortition (also known as allotment or demarchy) is the selection of political officials as a random sample from a larger pool of candidates. The logic behind sortition originates from the idea that “power corrupts.” For that reason, when the time came to fill positions of power, the ancient Athenians resorted to choosing by lot. In ancient Athenian democracy, sortition was therefore the traditional and primary method for appointing political officials, and its use was regarded as a principal characteristic of true democracy.

Today, sortition is commonly used to select prospective jurors in common law-based legal systems and is sometimes used in forming citizen groups with political advisory power (citizens' juries or citizens' assemblies).

The mechanics would be something like this: users report a post/comment; when there are enough reports, the system randomly selects 3/5/7/... currently active users and asks them to determine whether the reported post contravenes the rules. The decision is then made automatically by majority rule.
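
A minimal sketch of those mechanics in Python (the names, thresholds, and jury size are illustrative, not a real implementation):

```python
import random
from dataclasses import dataclass, field

REPORT_THRESHOLD = 5   # reports needed before a jury is convened (illustrative)
JURY_SIZE = 5          # odd, so a majority vote can't tie

@dataclass
class Post:
    author: str
    reports: list = field(default_factory=list)

def maybe_convene_jury(post, active_users):
    """Once a post accumulates enough reports, draw a random jury
    from currently active users, excluding the post's author."""
    if len(post.reports) < REPORT_THRESHOLD:
        return None
    pool = [u for u in active_users if u != post.author]
    if len(pool) < JURY_SIZE:
        return None
    return random.sample(pool, JURY_SIZE)

def jury_removes(votes):
    """votes: list of booleans, True = 'this breaks the rules'.
    The decision is taken automatically by simple majority."""
    return sum(votes) > len(votes) / 2
```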

Why?

  1. It's the only system that scales (to my knowledge). More users mean more content to moderate, but since the users are also the moderators, the system works at any scale. Systems that don't scale lead to all kinds of problems when the number of users becomes large.
  2. It's very robust to manipulation. Since moderators are chosen at random, it's very hard to coordinate with them or to influence their decisions.
  3. It promotes a participatory attitude and a sense of responsibility in the users. There's no "them against us" (the bad mods against the users).

21 comments

  1. [8]
    DanBC
    Link

    I like the idea of selecting moderators randomly.

    I know that I would be a terrible moderator. No-one wants me to have any mod powers (other than editing my own submission titles). I don't want mod powers.

    I don't know why I think randomly selecting mods is a good idea when I know that I'd be so awful at it.

    11 votes
    1. [3]
      Staross
      Link Parent

      If I show you a hateful comment and ask you "does this comment contravene the no-hate rule?", do you really think you wouldn't be able to answer that?

      Anyway, juries don't need to be perfect; that's the reason you take a bunch of jurors and apply majority rule.

      6 votes
      1. unknown user
        Link Parent

        If I show you a hateful comment

        This assumes that only hateful content will be flagged; if that were the case, then all flagged content could be removed immediately without any need for moderation.

        2 votes
      2. DanBC
        Link Parent

        If I show you a hateful comment and ask you "does this comment contravene the no-hate rule?", do you really think you wouldn't be able to answer that?

        I can do that. My tolerance for hate is lower than other people's, and so I would be saying lots of stuff breaks a no-hate rule where other people wouldn't.

        1 vote
    2. crius
      Link Parent

      Yes, on top of that, it's not a scalable system.

      A system like this is not trivial to implement, technically speaking.

      2 votes
    3. [2]
      pseudolobster
      Link Parent

      You have moderation powers on Wikipedia by default: on most articles, no account or trust system is involved.

      Have you ever felt compelled to use the moderation powers you've been given on Wikipedia for evil?

      1 vote
      1. DanBC
        Link Parent

        Have you ever felt compelled to use the moderation powers you've been given on Wikipedia for evil?

        Yes. Wikipedia is a great example. How many words do you think have been written about the dash symbol in "Mexican-American War"? (Not about the war itself, only about the punctuation used for the article's title.)

        There's something like 500,000 words written about that hyphen / en-dash / em-dash choice. It went all the way to ArbCom.

        I really don't want the hell of Wikipedia over here.

        2 votes
  2. [6]
    Algernon_Asimov
    Link

    It's the only system that scales (to my knowledge).

    The method used by StackExchange, which is similar to the method planned here at Tildes, seems to work. Users who demonstrate good behaviour, which is indicated by getting upvoted, are given the power to do some low-level moderation such as editing tags & titles. If they do well at that, they are stepped up to more involved moderation duties. As the number of users grows, the number of trusted users grows inevitably. More users means more moderators.
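
    As a rough sketch of that kind of privilege ladder (the thresholds here are invented for illustration, not StackExchange's or Tildes' actual numbers):

    ```python
    # Illustrative privilege ladder; thresholds and ability names are invented.
    PRIVILEGE_LADDER = [
        (0,    {"post", "comment"}),
        (100,  {"edit_tags"}),
        (500,  {"edit_titles"}),
        (2000, {"review_reported_content"}),
    ]

    def privileges(trust_score):
        """Abilities accumulate as a user's demonstrated trust grows."""
        granted = set()
        for threshold, abilities in PRIVILEGE_LADDER:
            if trust_score >= threshold:
                granted |= abilities
        return granted
    ```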

    It's very robust to manipulation. Since moderators are chosen at random, it's very hard to coordinate with them or to influence their decisions.

    With a different random group of moderators considering every reported post, it'll be hard to coordinate any consistent approach.

    It promotes a participatory attitude and a sense of responsibility in the users. There's no "them against us" (the bad mods against the users).

    Only among users who want to be engaged. Requiring a majority vote on every single reported item could get quite tiresome. I can see some people just ignoring the request to review an item (you mention jury duty - look at how many people don't want to be jurors). This will leave two groups of people engaged: the people who want to moderate and the trolls. And, when you're selecting people randomly, you'll have no idea how many moderators, apathetics, and trolls you'll get in a particular mix. If the apathetics outnumber the moderators, the trolls will win.
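
    A quick simulation of that worry (the troll and apathy rates here are invented for illustration):

    ```python
    import random

    def troll_win_rate(jury_size=5, p_troll=0.05, p_apathetic=0.5, trials=100_000):
        """Estimate how often trolls carry a jury when apathetic members
        simply never respond. All rates are invented for illustration."""
        wins = 0
        for _ in range(trials):
            votes = []
            for _ in range(jury_size):
                r = random.random()
                if r < p_troll:
                    votes.append("troll")
                elif r >= p_troll + p_apathetic:
                    votes.append("good")
                # otherwise: apathetic juror, never responds
            if votes and votes.count("troll") > len(votes) / 2:
                wins += 1
        return wins / trials
    ```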

    7 votes
    1. arghdos
      Link Parent

      Users who demonstrate good behaviour, which is indicated by getting upvoted, are given the power to do some low-level moderation such as editing tags & titles. If they do well at that, they are stepped up to more involved moderation duties. As the number of users grows, the number of trusted users grows inevitably. More users means more moderators.

      As a counterpoint, I like the idea of sortition as an "introduction" to moderation. I suspect it's true that not all good users would be good moderators, and that many more simply would not like to take up that mantle. Sortition provides a fairly streamlined way to determine whom to transition from power user to moderator.

      For instance, when a user's trust rating starts to get to the point where they would start getting moderator duties, a good test of whether that user would be a good moderator is to randomly select some reported comments/posts for them to review. I picture this working similarly to Reddit, where the reports show up underneath a comment or post, and a drop-down menu reveals the text of the reports and some action buttons (including one that says "stop showing me this" to let a user opt out of moderation). Let a new moderator make their decision on the content; if more experienced moderators agree with that decision, that's a sign that the user will likely be a good addition to the moderation team (and of course, the converse is true as well).

      Like you, I agree that majority rule is a poor method to implement moderation; however, I could see a threshold system working quite well for this purpose. For instance, a moderator could have a separate "trust" score based on their moderation activities: a new mod would have almost no moderation trust, and an experienced, active, competent moderator a lot of it. A new mod's decision to remove a comment or post might not be enough on its own, but once a trust threshold has been reached, e.g. via the decisions of multiple new moderators or one experienced mod, the content would be removed.
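
      A sketch of that threshold idea (the trust values and threshold are made up):

      ```python
      REMOVAL_THRESHOLD = 10.0   # total moderation trust required to remove content

      def removal_approved(decisions):
          """decisions: list of (mod_trust, wants_removal) pairs.
          A single new mod's removal vote isn't enough on its own;
          content comes down once enough weighted agreement accumulates."""
          weight = sum(trust for trust, wants_removal in decisions if wants_removal)
          return weight >= REMOVAL_THRESHOLD
      ```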

      4 votes
    2. [4]
      Staross
      Link Parent

      You could have some conditions (good behaviour, karma, ...) that would allow someone to be in the pool of potential jurors.

      With a different random group of moderators considering every reported post, it'll be hard to coordinate any consistent approach.

      People would just enforce the rules, which I think is the right kind of consistency. Of course, the rules need to be good and clear, but that's another issue.

      Requiring a majority vote on every single reported item could get quite tiresome.

      A single moderator moderating everything also gets tiresome. More people doing the work = less work per person.

      I think there should also be some kind of reward to encourage people to do this moderation.

      One issue with granting permanent rights to people is that they can change, go crazy, sell their account, or just play nice for a while until they get power. With sortition+majority you can't do that because you only have limited power for a limited amount of time, and no control over it. Trying to game the system is very unrewarding.

      2 votes
      1. [3]
        Algernon_Asimov
        Link Parent

        You could have some conditions (good behaviour, karma, ...) that would allow someone to be in the pool of potential jurors.

        You're very close to proposing the "trusted user" model that StackExchange uses, and which is proposed for Tildes. :)

        People would just enforce the rules, which I think is the right kind of consistency. Of course, the rules need to be good and clear, but that's another issue.

        Even with the clearest rules in the world, there are always going to be grey areas, and different interpretations. I've had lots of discussions with fellow moderators about whether a particular post or comment breaks the rules or not - and there are always different opinions. Of course, you're proposing a vote on each reported item, which should smooth things out - but you'll still end up with some moderator decisions going one way and some moderator decisions going another way, depending on which particular users were randomly selected to review an item.

        More people doing the work = less work per person.

        I never said that the only other option was "a single moderator". I explicitly described a model with multiple users progressing to moderator status.

        With your model, you're always co-opting a new group of people for each reported item. If there are lots of reports, then the number of required reviewers increases until each user finds themselves being asked to review multiple items every day, whether they want to or not. These people aren't going to suddenly be motivated just because you give them work to do.

        I think there should also be some kind of reward to encourage people to do this moderation.

        Well, we can't pay them. The only thing we can offer them is perks & prestige. What if we were to "promote" them to more senior roles on the website, with more abilities and more recognition? "Staross is a Level 3 mod."

        One issue with granting permanent rights to people is that they can change, go crazy, sell their account, or just play nice for a while until they get power.

        The "trusted user" model proposed for Tildes seems to include a "use it or lose it" aspect. If you don't keep moderating, you'll lose your mod abilities. And if your moderator decisions turn bad, you'll lose your mod abilities. In other words, if you change, or sell your account, you'll soon find yourself no longer being a moderator.

        4 votes
        1. [2]
          sublime_aenima
          Link Parent

          Do you have a link to the proposed mod system here? My initial reaction to a system that rewards votes with mod permissions is “oh hell no!” Some of the best mods I know on reddit are ones that have minimal karma (if any). At the same time we’ve probably all seen how the quest for karma and/or power has made some subs on reddit implode, or subs simply be fronts for spam.

          4 votes
          1. Algernon_Asimov
            Link Parent

            The future mechanics of trust and moderation are outlined here. They're not extremely detailed. There have been many other discussions in ~tildes and ~tildes.official about how this might work in detail; it's definitely a work in progress. However, whenever I've said this seems similar to how StackExchange operates (which I'm acquainted with), no one has ever contradicted me, so you could do some further reading over there.

            It's important to note that it's not as simplistic as "votes = moderator". There will be multiple levels of moderator powers. Getting upvoted will only get you on to the lowest rung of moderation (probably something like editing tags & titles of posts). To move up the rungs to gain higher moderator powers, you have to demonstrate good moderation on the lower rungs. The only people who will become fully-fledged all-powerful moderators are people with a long history of good moderation at lower levels.

            2 votes
  3. [4]
    vakieh
    Link

    Majority rule is not appropriate, because there is nothing stopping the majority from being dumb, hateful people. You could apply that system to the trust system, but instead I believe ~ is going for a 'why select 5% of the trustworthy users when you can just select 100% of them?' approach.

    5 votes
    1. [3]
      Staross
      Link Parent

      If the majority of your user base are dumb, hateful people, you've got bigger problems than just moderation rules and you probably want to close your website. I don't think it's a reasonable assumption to work under.

      3 votes
      1. [2]
        unknown user
        Link Parent

        Even if a vast majority of users are "good", a random sample can still contain zero of those people. There's no assumption that most users are hateful.

        4 votes
        1. Staross
          Link Parent

          Yes, but it's not very probable. If 5% of your user base is hateful, only about 1 in 800 juries of 5 will have a hateful majority. With a jury of 7 it drops to about 1 in 5,000.

          Plus you could have some karma/good behavior requirement to be eligible for jury duty.

          I don't think it's a serious issue.
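
          Those figures check out under a binomial model (assuming each juror is independently hateful with probability 5% and hateful jurors always vote the hateful way):

          ```python
          from math import comb

          def p_hateful_majority(jury_size, p=0.05):
              """Probability that a random jury contains a hateful majority."""
              need = jury_size // 2 + 1
              return sum(comb(jury_size, k) * p**k * (1 - p)**(jury_size - k)
                         for k in range(need, jury_size + 1))

          # p_hateful_majority(5) ≈ 0.00116, about 1 in 860
          # p_hateful_majority(7) ≈ 0.00019, about 1 in 5,200
          ```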

          3 votes
  4. [2]
    Greg
    Link

    This is fairly reminiscent of the Slashdot moderation system: accounts meeting a basic minimum set of criteria are randomly assigned five "mod points" which allow up to 5 posts to be moderated. If they aren't used within 3 days, they expire.

    That was used as the only way to up/downvote any posts, so it doesn't map exactly, but at its peak it was demonstrably effective on a very large scale. Taking the same approach to reported posts sounds broadly sensible to me.
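
    From memory, the mod-point mechanics looked roughly like this (the sampling fraction is invented; treat the whole thing as a sketch):

    ```python
    import random
    import time

    MOD_POINTS = 5
    POINT_LIFETIME = 3 * 24 * 3600   # points expired after 3 days

    def grant_mod_points(eligible_accounts, fraction=0.1):
        """Randomly hand expiring mod points to a sample of accounts
        that meet the basic criteria."""
        n = max(1, int(len(eligible_accounts) * fraction))
        chosen = random.sample(eligible_accounts, n)
        expires_at = time.time() + POINT_LIFETIME
        return {account: {"points": MOD_POINTS, "expires_at": expires_at}
                for account in chosen}
    ```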

    All that said, Slashdot went pretty sharply downhill in more recent years, so it's always worth treating its example with a measure of caution. Personally, I don't think the moderation system was specifically to blame, but it's clear that something went wrong...

    3 votes
    1. Crespyl
      Link Parent

      The Slashdot system was also supported by "meta-moderation" that allowed users to review and affirm or deny individual moderation events. Users who were judged to be correctly applying the rules would be more likely to receive mod points in the future, and users who failed, less likely.
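
      Something like this, as a sketch (the step size and clamping are invented, not Slashdot's actual formula):

      ```python
      def update_mod_weight(weight, metamod_verdicts, step=0.05):
          """Nudge a user's chance of receiving future mod points up or down
          based on meta-moderation verdicts (True = moderation judged fair)."""
          for fair in metamod_verdicts:
              weight += step if fair else -step
          return min(1.0, max(0.0, weight))
      ```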

      Despite the waning userbase and corporate/leadership turmoil at Slashdot over the last several (too many) years, I think even now the comments there are still pretty well managed and easy to sort through. Being able to filter specifically by comment tags (Informative/Funny/etc) and votes (with all comments capped at +5) makes it still my favorite implementation of a comment section, despite the other problems with the site.

  5. tomf
    Link

    In one sub I mod alone (for the most part), I have comments that have been reported three times automatically filtered off. In the three or four years since I took over the sub, only one comment has ever been falsely filtered.

    I like this idea, but there is still a need for overall moderation to guide a community. If there were a karma threshold of 500 to vote on reports, that could lessen a lot of the load on said community's mod team.
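
    A sketch of that combination, auto-filtering plus a karma gate (the numbers are the ones mentioned above):

    ```python
    from dataclasses import dataclass

    REPORT_AUTOFILTER = 3      # reports before a comment is filtered off
    JURY_KARMA_MINIMUM = 500   # suggested karma threshold to vote on reports

    @dataclass
    class Comment:
        text: str
        reports: int = 0
        filtered: bool = False

    def report(comment):
        """Auto-filter a comment once it has been reported enough times."""
        comment.reports += 1
        if comment.reports >= REPORT_AUTOFILTER:
            comment.filtered = True   # hidden pending moderator review

    def may_vote_on_reports(karma):
        """Gate report-voting behind the suggested karma threshold."""
        return karma >= JURY_KARMA_MINIMUM
    ```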

    Having random people would also work, but that also relies on those people being a) present, b) willing, and c) discerning.

    Overall, the key to successful moderation is to manage the community's expectations -- and there is a fine line between laying out the rules and consequences and giving the trolls a line to walk. This is where static leadership is critical.

    2 votes