12 votes

Discussion time: What are your opinions on digital censorship?

Given the furor on YouTube over the new COPPA rules and the way they will affect content creators, I'd sort of like to know the opinions of tildezens on the topic of digital regulation. What do you guys think about heavy moderation or even censorship on social media platforms? Do you think it's a necessary way to keep online communities safe for all, or is it stifling open discussion on the internet?

(Not sure if this should be placed here or in tech, please don't flame)

8 comments

  1. rogue_cricket
    (edited )

    I think moderation is important for communities, and I think often the framing of something as "censorship" is kind of alarmist. Like if a manager kicks someone out of a grocery store for ranting about how the Jews control the media, nobody gets up in arms about how he's being unfairly censored. He expressed a nasty viewpoint in an inappropriate way in an inappropriate place and he was removed from the space. That's how I feel about 80-90% of online "censorship".

    Sacha Baron Cohen recently used the phrase "it's freedom of speech, not freedom of reach" and I generally agree. You can say what you like. However, people who own spaces where they allow the public to comment or make their views known, including businesses like websites and events, are not obligated to give you a megaphone. I would actually be pretty horrified if I were required to host someone who wants to espouse harmful views with my server space or a physical space which I have paid for. I should not have to use my resources to let them amplify their opinions for free. As long as people are free to protest outside about it - entirely fine by me! - freedom of expression is maintained.

    I also believe that a complete lack of moderation results in a worse community with less diversity of ideas, because you end up with less diversity of people. If you allow the most toxic and unpleasant people to post, harass, bully, and otherwise make interaction negative... then people will leave. And the people who leave are generally the ones most often targeted by this harassment - usually sexual and gender minorities and nonwhite people. The people who won't leave are the kind of people who relish the ability to espouse viewpoints that are actively hostile to others, and it gets worse and more toxic over time as these views become entrenched as part of the community's culture. It becomes even more "echo-chambery" than a community that is moderated too tightly, in my opinion.

    EDIT: I do want to mention as an aside that a monopoly publisher, bankroller, or host is obviously quite bad for this - I think the problem is the monopoly itself, not the fact that the monopoly has the power to kick people off of their platform or refuse to publish things they do not want to publish. It is the monopoly that should be prevented or broken up, not their freedom to choose what to host. The root cause there is that these media/publishing resources have accumulated into the hands of too few people but that's a whole thing on its own.

    27 votes
  2. [3]
    Amarok

    My chief concern is that 'advertiser friendliness' seems to be driving these decisions. I think we need to do a lot better than that, it's a shitty metric. Moderation is essential, I've no doubts at all about that... but it should be more about community than making money.

    15 votes
    1. [2]
      dubteedub

      Honestly, if that is what it takes for corporations to rationalize stricter moderation and keeping nazis off their platforms, I'm all for it. Yes, it should be the moral / ethical choice for them to make, but it also makes business sense in the long run.

      3 votes
      1. Amarok

        All it means is that anything deemed controversial will be censored on most services. It doesn't seem like ethics are involved at all, just money. That same money will just as easily keep open popular places on other services that ought to be dealt with. If reddit weren't overly concerned with their bottom line they'd have burned the_donald a long time ago, for example.

        Money and ethics are rarely on the same side. It is a start, though - at least people are talking about the problems.

        4 votes
  3. [3]
    skybrian

    It seems like the conventions are still evolving. Most people will voluntarily tag things as NSFW which I think indicates widespread support and enablement of one form of moderation. On the other hand trigger warnings are controversial. It seems like simpler and less-controversial labels have an evolutionary advantage?

    Some kind of "ask your parents before watching this" tag might make sense but it seems nobody has come up with something simple and snappy enough yet?

    5 votes
    1. [2]
      moocow1452

      Maybe something a little more established like "Viewer Discretion Advised" or "Content Advisory" would work better? "Content Warning" puts the emphasis on the Content being an issue, rather than the personal "Trigger" on an individual.

      2 votes
      1. skybrian

        Those all seem too official. I think something casual-sounding like "AYPF" (ask your parents first) would be more likely to be adopted? But a convention like that would be more likely to be used in a forum where you know kids are around. The default on the Internet is to assume, without evidence, that they aren't.

        And that's still too simple since what's appropriate depends on age. Like, what's the age of the kids you're thinking about when you decide whether to use this tag?

        It's all very muddled. But libraries do have a children's section and it seems to work out? Maybe YouTube should be asking whether videos should be included in the children's section, and people would understand better whether to check the box?

        2 votes
  4. mrbig
    (edited )

    The word "censorship" is applied very loosely in many contexts. In most cases, I associate censorship with the actions of dictatorships or elected governments that act like dictatorships. One of the main differences between censorship and moderation is that censorship can suggest alterations and prevent something from being published or divulged beforehand, while moderation usually happens after the fact.

    Private companies can practice censorship when they are de facto monopolies (as is the case with YouTube) and their acts of moderation clearly intend to suppress dissent. Notice that suppressing dissent for its own sake is usually bad, but this does not mean they must always give equal space to two or more sides of a dispute (see false balance). More important than that is for moderation teams and algorithms to act in a manner consistent with their stated policies. If group A requires more enforcement than B, so be it.

    3 votes