11 votes

Tech Execs Face Jail In Australia If They Don’t Remove “Abhorrent” Content Quickly Enough

10 comments

  1. [9]
    knocklessmonster
    Link

    This is a bad move. I'm not typically one to side with Facebook, but if they had a way to instantly stop violent footage on their platform, they would have found it. Hell, half an hour was plenty fast for a site as large as Facebook.

    Facebook would have to remove all violent content, or start blocking videos as soon as they were reported, to comply with Australia's law, and the latter would create an easy-to-abuse reporting system.

    9 votes
    1. [2]
      FZeroRacer
      Link Parent

      I'm hesitant to side with Facebook because this is purely a beast of their own making. Tech companies have prioritized growth at all costs while not designing their platform to scale in many ways that are currently causing vast amounts of societal harm. Stricter laws and punishments are in effect a long-overdue reckoning for recklessness.

      14 votes
      1. nothis
        (edited )
        Link Parent
        • Exemplary

        I get the concern but this law basically outlaws amateur live streaming.

        At least Facebook could afford an army of content moderators (though they already have one, which calls the law's feasibility into question); most sites couldn't. Note that this isn't a case of the police notifying them and the takedown then taking an hour (which would be harsh enough): they're expected to build, on their own, the infrastructure to detect something as specific and obscure as this video and respond correctly in under an hour. It's a demand deeply at odds with how the internet works, even on a more idealistic view; I'd consider it close to impossible to comply with. It's a traditional publisher-focused law, which makes sense for a few dozen TV stations broadcasting to millions. But this is a few million people, mostly broadcasting to a dozen or so.

        It always strikes me how the internet is both more public and more private than pretty much any medium before it. It really requires re-thinking our views of media law. I think the correct analogy, considering just how ingrained the internet is in everyday life, is drugs. Yes, you can ban stores from selling them, but you can't keep people who really want them from acquiring them somehow. The Australian law, as applied here, would be like jailing the owner of a fast food chain because someone sold drugs to a lot of people in one of its restaurants. Basically, this is the cost of providing a public space, and it's reasonable to think of the internet as a public space.

        I think between fake news, alt-right hate brigades, and terrorist propaganda videos, there has to be a conversation about how to handle such content, but "jail time for social media execs" isn't a serious contribution to it. Making such videos illegal to host makes sense, but the law should allow a reasonable delay for deletion.

        8 votes
    2. [7]
      Comment deleted by author
      Link Parent
      1. [2]
        papasquat
        Link Parent
        • Exemplary

        Imagine a news station had run that footage for half an hour through some public submission system. Would anyone defend them? So what makes half an hour for Facebook becoming aware of the video a good metric?

        Because Facebook didn't produce the content people upload, your analogy is a little flawed. It would be like holding telephone pole manufacturers liable for the news broadcasts transmitted over the cables they hold.
        Facebook, like every other social media provider, designed a system where users can directly upload content; beyond that point, how they designed the system is irrelevant. If someone uploaded the Christchurch video to this site while Deimos happened to be sleeping, you'd better believe it would take over an hour to remove as well. Should he be held liable for that? Of course not; that's ridiculous. Either you're okay with users being able to upload content to the internet semi-anonymously, or you're not. That's pretty much the whole discussion.
        Personally, I'm okay with abhorrent content slipping through the cracks every now and again if it means that normal people without tons of resources are able to create and share things that everyone can see. It doesn't seem like Australia feels the same way. I feel bad for the citizens of that country.

        12 votes
        1. [2]
          Comment deleted by author
          Link Parent
          1. papasquat
            Link Parent

            My analogy specifically said "public submission system" which would mean the station did not produce the content either. The point is they would have made the choice to have such a system, just as Facebook has chosen to have such a system.

            Even so, television is a one-to-many communication system. A TV station is responsible for a single video stream, which makes it ridiculously easy to screen. They're effectively producing all of the content on their station, because it has to go through dozens of people before it's published. This stuff does still happen occasionally, though. People flash cameras in front of live news broadcasts, scream obscenities, and two journalists were once murdered on live TV, footage millions of people have seen at this point. Should the television stations be punished for publishing that content?

            Facebook wants people to upload videos.

            The same goes for every video sharing service in existence, for profit or not. Why would anyone build and release something they didn't want anyone to use? The concept applies to Tildes as well: the site is built around sharing content, and if someone shared something illegal in text form, the administrators would be responsible for taking it down. Despite that, we still live in the real world, and demanding that nothing bad ever appear on a platform where users can upload content means those platforms cannot functionally exist.

            I don't see how you can want these platforms to exist while also placing that impossible standard on them. Realistically, the only way anything will ever change with them is via legislation like the original story details, but that legislation isn't grounded in reality, and if it's actually enforced with real consequences, the platforms will eventually just shut down features like livestreaming and instant video sharing. That tradeoff is terrible, and I think most people agree.

            4 votes
      2. [4]
        Wes
        Link Parent

        Imagine a news station had run that footage for half an hour through some public submission system. Would anyone defend them? So what makes half an hour for Facebook becoming aware of the video a good metric?

        That's a pretty substantial difference. A news station makes a deliberate decision to run any and all content. They have the opportunity and skill to filter that content ahead of time.

        Facebook can't possibly review all content manually ahead of time. They can try to write clever algorithms to detect violating content, or they can act on reports, but the workforce required to review everything would be astronomical.
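        To illustrate the "clever algorithms" in question: the usual first line of defense is fingerprint matching against a blocklist of known abusive material. Below is a minimal sketch (the blocklist entries and function names are hypothetical; it uses exact SHA-256 hashing for simplicity, whereas production systems such as PhotoDNA or PDQ use perceptual hashes so re-encoded or trimmed copies still match):

```python
import hashlib

# Hypothetical blocklist of fingerprints of known abusive files.
# Here the entry is the SHA-256 of the bytes b"foo", standing in
# for a real flagged video file.
KNOWN_BAD_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of an uploaded file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """Reject an upload whose fingerprint is on the blocklist."""
    return file_fingerprint(data) in KNOWN_BAD_HASHES
```

        Note that exact hashing only catches byte-identical re-uploads; the harder problem of matching near-duplicates in live video at scale is exactly why "detect it instantly" is not a realistic legal standard.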

        For the amount of content shared on Facebook, acting in 30 minutes seems quite respectable to me.

        5 votes
        1. [4]
          Comment deleted by author
          Link Parent
          1. papasquat
            Link Parent

            If Facebook wants to continue reaping astronomical profits, I’m fine with requiring they perform an astronomical amount of work under penalty of law.

            Why in the world would Facebook operate a service they would almost certainly lose money on? Paying a person to watch every livestream on a 30-second (or whatever) delay so they could screen it for content would obviously be unprofitable, and would place an extreme amount of legal liability in the hands of someone most likely hired off the street for 10 bucks an hour. If laws like this existed, what company in its right mind would ever allow livestreaming of any kind?

            5 votes
          2. Wes
            Link Parent

            If Facebook wants to continue reaping astronomical profits, I’m fine with requiring they perform an astronomical amount of work under penalty of law.

            Which is, of course, not viable for anyone. It doesn't get easier at Facebook's scale; it gets harder. Essentially live user content could not exist under your proposed system.

            Consider the same standard at other levels. Should your ISP be required to monitor everything you look up? Should Deimos be required to manually approve every comment on Tildes? No, because then Tildes could not exist, and a free internet could not exist.

            5 votes
          3. Luna
            Link Parent

            So does Facebook, only their decision is to run literally any and all content

            They assume everyone is acting in good faith and abiding by the TOS, similar to how your email provider and mail carrier assume you aren't sending illegal content until they get a court order. I believe the issue here is whether companies should be required to screen everything, or whether they can use automated methods and update them as new incidents unfold. The internet largely operates on the latter method today, and short of companies hiring millions of moderators to work 24/7, the former is impossible at the scale social media sites operate at.

            2 votes
  2. DonQuixote
    Link

    "No-one expects the government to be experts on everything and that's why they should consult with the community and experts to develop policy that is reflective of not just community sentiment, but also practical realities and expert opinion," Sulston said.
    "The fact that this legislation was rushed through at great haste with no consultation really underlies the fact that it is technically poor and very vague."

    I see the speed at which society is moving, not to mention its increasing acceleration, as the real elephant in the room here. We've reached a point where the only players that can react fast enough to maintain 'order' are the algorithms. Government, and even corporations and Adam Smith's 'invisible hand', are too slow to keep up. This will lead to decisions like this one, and to others where no single person or group is an expert and there is simply no time for a committee to meet and decide before the rules of the game have changed again.

    Recently I read an article arguing that sending humans on interstellar trips doesn't make sense if the journey would take more than 50 years, because in that time technology would advance to the point where later missions could overtake the first.

    It seems we're headed inexorably toward the time when physical humans and groups of humans are too slow to maintain the governance and viability of technological societies. Uploading the mind into a pre-21st century virtual space may soon be our only course of action.

    3 votes