14 votes

Topic deleted by author

12 comments

  1. [6]
    pseudolobster
    Link

    I'm afraid that's a problem that's existed since the beginning of the internet, and no one has fully solved it yet. The best you can hope to do is make it inconvenient enough for bad actors that they give up. Require email for registration, ban throwaway email services. You can IP-ban them, then ban proxies, VPNs, Tor, etc. You can require a small fee for registration. You can make the site invite-only and have a community vetting process. You could implement something akin to shadowbanning, where the user doesn't immediately know they've been banned. All of these are stopgaps, though, and if someone's determined to be a troll, it's often a losing proposition to fight them. A lot of these measures also decrease the quality of the site experience for regular users.

    The one I've seen work the best has been charging five bucks for an account. This is unfortunately a huge barrier to some users who won't be able to pay, but it does force trolls to keep paying out for new accounts every time they're banned. This has worked well for some very long-running forums such as MetaFilter and SA, both of which have been online since the '90s.

    15 votes
    1. [5]
      tomf
      Link Parent

      I like a combination of invite-only (where you're accountable for your invites) and the buy-in option. $5 is a good figure, too.

      Since this community started coming up, I figured that all of the comments would be in ~humanities.askbiblescholars, with that subgroup being removed from the default feeds.

      4 votes
      1. [4]
        cfabbro
        (edited)
        Link Parent

        I'm all for invite-only too, but I personally really, really, really hate the idea of buy-ins for accounts. Not because I can't afford them myself, but because I know a lot of amazing people who can't, or have no access to credit cards in the first place... a number of Tildes quality contributors included. Sure, it's an easy way to reduce the number of moderation issues you have to deal with, but it's also discriminatory against the economically disadvantaged (no matter how little you charge) and so ends up reducing diversity on the site.

        9 votes
        1. tomf
          Link Parent

          Yeah, this is why I like a combination. Need an invite and don’t know anyone? Buy-in or email us.

          I’m on a few sites that use this method and it works well.

          4 votes
        2. [3]
          Comment deleted by author
          Link Parent
          1. [2]
            cfabbro
            Link Parent

            Yeah, for a blog, I guess paid accounts are a lot less problematic so long as the contents are still public. But for social media sites, it's a different story.

            2 votes
            1. [2]
              Comment deleted by author
              Link Parent
              1. highsomatic
                Link Parent

                You can have comments manually approved by you or a trusted user before they appear, and combine that with a time limit on comment submission. I imagine that would discourage a lot of flaming attempts, since the perpetrator can't reach their audience on their own terms.
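
                A minimal sketch of how that could look, assuming an in-memory pending queue; the rate limit and all names here are purely illustrative:

                ```python
                import time

                PENDING = []       # comments waiting for manual approval
                LAST_POST_AT = {}  # author -> timestamp of their last submission
                MIN_GAP = 300      # illustrative limit: one comment per 5 minutes

                def submit(author, body):
                    """Rate-limit submissions and queue them instead of publishing directly."""
                    now = time.time()
                    if now - LAST_POST_AT.get(author, 0) < MIN_GAP:
                        return "rate-limited"
                    LAST_POST_AT[author] = now
                    PENDING.append({"author": author, "body": body, "approved": False})
                    return "queued for approval"

                def approve(index, reviewer_is_trusted):
                    """Only the site owner or a trusted user can make a queued comment visible."""
                    if reviewer_is_trusted:
                        PENDING[index]["approved"] = True
                ```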

                4 votes
  2. [2]
    reifyresonance
    Link

    Degrade quality of service for known bad accounts: say, take 2-15s to return a page, and give cryptic errors 1/3 of the time they try to POST anything. Make it inconvenient and not fun for them and maybe they'll get bored. You can ramp this up slowly, or do it all at once. (This is more for dealing with a known bad actor who keeps trying to evade a ban, rather than an every-case solution.)
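
    As a rough sketch (the flagged-accounts set, delays, and error text below are invented for illustration, not taken from any particular framework):

    ```python
    import random
    import time

    FLAGGED = {"persistent_troll"}  # accounts known to be evading a ban

    def degrade(user, method):
        """Return a fake error for flagged users, or None to handle the request normally."""
        if user not in FLAGGED:
            return None
        time.sleep(random.uniform(2, 15))  # take 2-15 seconds to return the page
        if method == "POST" and random.random() < 1 / 3:
            return "Error 0x2F: request could not be completed"  # deliberately cryptic
        return None
    ```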

    7 votes
    1. [2]
      Comment deleted by author
      Link Parent
      1. DataWraith
        Link Parent

        Just wanted to chime in to say that this does work.

        There was a now-defunct German website that successfully did this via CAPTCHAs. The fun thing about it was that it was randomized, so if you used an unwanted phrase in your comment, you'd get a high(er) chance of having to solve a CAPTCHA before the comment went through. Each additional bad word or phrase would increase the probability, and then they repeated the check after every challenge: if your initial chance of seeing a CAPTCHA was 90%, you actually had an 81% chance of seeing two CAPTCHAs, about a 73% chance of seeing three, and so on... it would be kind of evil if it weren't a measure to prevent abuse.

        Some people solved 15 or more challenges in order to post their bile, but that takes quite a bit of time, which is why the system was called Trollbremse (troll brake). The site owners considered it a win, since these people, while solving CAPTCHAs, were wasting time they could otherwise have used for more trolling...
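
        A toy reconstruction of that mechanism (the phrase list and penalty are made up; this is not the actual Trollbremse code). Because the check is re-rolled after every solved challenge, a 90% trigger chance means 0.9² = 81% for two CAPTCHAs, 0.9³ ≈ 73% for three, and so on:

        ```python
        import random

        BAD_PHRASES = ["flame", "slur"]  # illustrative only
        PER_HIT = 0.3                    # each hit raises the CAPTCHA probability

        def captcha_probability(comment):
            hits = sum(comment.lower().count(p) for p in BAD_PHRASES)
            return min(0.9, hits * PER_HIT)

        def challenges_required(comment):
            """Re-roll after every solved CAPTCHA, so trolls may face a long chain of them."""
            p = captcha_probability(comment)
            n = 0
            while random.random() < p:
                n += 1
            return n
        ```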

        7 votes
  3. stu2b50
    Link

    An IP ban is one option. There are some packages that do this for you, but in this case you may want to roll your own to better integrate it into your systems. For instance, create two tables: an IP banlist and a user-IPs table.

    Whenever a user logs in or registers with another unique IP, add another row to the user-IPs table. Then, when you "ban" a user, add all of their IPs to the banlist. When they try to register or log in with an IP on the banlist, just send a 403 Forbidden or something.

    Of course, changing your IP is merely annoying, far from impossible. You can also try blocking entire subnets, but tbh they'll probably change IPs with a VPN anyway.

    If you're dealing with bots, you can try a CAPTCHA. Email verification with uniqueness enforced is another method. All of this just makes it more costly; a user who really wants to make more accounts can do so. It's not really possible to prevent it, only to hopefully make it too annoying to be worth it.
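
    A rough SQLite sketch of those two tables and the ban/check steps (the schema and names are invented for illustration):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE user_ips (user_id INTEGER, ip TEXT, UNIQUE (user_id, ip));
        CREATE TABLE ip_banlist (ip TEXT PRIMARY KEY);
    """)

    def record_ip(user_id, ip):
        """Call on every registration/login to remember each unique IP per user."""
        conn.execute("INSERT OR IGNORE INTO user_ips VALUES (?, ?)", (user_id, ip))

    def ban_user(user_id):
        """Copy all of the banned user's known IPs onto the banlist."""
        conn.execute(
            "INSERT OR IGNORE INTO ip_banlist SELECT ip FROM user_ips WHERE user_id = ?",
            (user_id,),
        )

    def is_banned(ip):
        """Check at login/registration; if True, respond with 403 Forbidden."""
        return conn.execute("SELECT 1 FROM ip_banlist WHERE ip = ?", (ip,)).fetchone() is not None
    ```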

    6 votes
  4. teaearlgraycold
    Link

    I think your best bet is to shadow-ban bad actors. And if possible, show content from shadow-banned users not just to the user, but to anyone from one of the user's IPs.
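
    A minimal sketch of that visibility rule, with made-up in-memory stores standing in for real tables:

    ```python
    SHADOW_BANNED = {"troll42"}  # illustrative data
    USER_IPS = {"troll42": {"203.0.113.7"}, "alice": {"198.51.100.4"}}

    def is_visible(author, viewer, viewer_ip):
        """Shadow-banned content stays visible to its author and to anyone on the author's IPs."""
        if author not in SHADOW_BANNED:
            return True
        return viewer == author or viewer_ip in USER_IPS.get(author, set())
    ```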

    6 votes
  5. joplin
    Link

    One thing I've seen is that new accounts go to moderation until x good posts have been made, or a moderator OKs them. Or it could be time-based. All posts from new accounts are moderated until 1 month after their first post. (I say after their first post rather than after account creation because it keeps a person from registering an account and letting it sit for a month to evade moderation.)
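
    A small sketch of that rule, with placeholder thresholds and the clock starting at the first post, as described above:

    ```python
    from datetime import datetime, timedelta

    GOOD_POSTS_NEEDED = 5           # illustrative stand-in for "x good posts"
    PROBATION = timedelta(days=30)  # 1 month after the first post

    def needs_moderation(first_post_at, good_posts, moderator_ok, now=None):
        """New accounts stay moderated until they age out, earn trust, or get OK'd."""
        if moderator_ok or good_posts >= GOOD_POSTS_NEEDED:
            return False
        if first_post_at is None:   # the clock starts at the first post, not at registration
            return True
        now = now or datetime.utcnow()
        return now - first_post_at < PROBATION
    ```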

    5 votes
  6. WendigoTulpa
    Link

    You could try having a minimum account age of 1 month before posting. Though I guess the person could just make a bunch of new accounts at a time, it might be annoying for them to do that and keep track of them all. You could also probably tell if a single IP is registering several accounts in one day.
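
    Both checks are straightforward to sketch; the limits and names here are illustrative:

    ```python
    from collections import Counter
    from datetime import timedelta

    MIN_ACCOUNT_AGE = timedelta(days=30)
    MAX_SIGNUPS_PER_IP_PER_DAY = 2

    def may_post(created_at, now):
        """Enforce the minimum account age before posting is allowed."""
        return now - created_at >= MIN_ACCOUNT_AGE

    def suspicious_ips(signups, day):
        """Flag IPs behind several registrations on a given date; signups is a list of (ip, timestamp) pairs."""
        counts = Counter(ip for ip, ts in signups if ts.date() == day)
        return [ip for ip, n in counts.items() if n > MAX_SIGNUPS_PER_IP_PER_DAY]
    ```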

    4 votes