As much as I hate to defend Facebook, a 29-minute response time is remarkably fast. If no one actually reported the video, how do we expect Facebook to have done anything? AI isn't nearly good enough to do this work yet.
With this logic, we should also be upset at the NZ police, who enabled him to do the shooting because they didn't arrest him the day before.
29 minute response time is remarkably fast.
They didn't respond in 29 minutes. 29 minutes was how long it took for them to even find out about the video, because someone had manually reported it at about 14:09 (local time). However, they didn't act on it until about 20 minutes after that (around 14:30), when the police used a special channel to report it. The video was then removed within a few minutes.
The video ran for a full 17 minutes, but "The $US473 billion ($A663 billion) company's vaunted artificial intelligence technologies, designed to stop extremist and illicit content such as pornography or videos depicting violent acts from appearing on the platform, failed to detect anything objectionable in the Christchurch shooting stream." Somehow, a video which graphically depicts multiple people being shot could not be identified as violent by an artificial intelligence program specifically designed to recognise violent videos. This wasn't a subtle video that only alluded to violence. It depicted violence up close and personal, in a very unambiguous fashion. Even artificial stupidity should have been able to identify that, let alone artificial intelligence.
And then, failing to act on a normal user's report while responding quickly to a police report shows just how seriously Facebook takes user reports. If you don't have access to the "special escalation channel for law enforcement and intelligence agencies", Facebook is going to take its own sweet time reviewing your report. We're not talking about volunteer moderators, who aren't required to be online all the time. We're talking about paid teams of moderators, whose sole job is to respond to these reports.
With this logic, we should also be upset at the NZ police, who enabled him to do the shooting because they didn't arrest him the day before.
No one's saying Facebook should have blocked the video the day before it was posted. However, it's quite illuminating that police, who had to physically travel from one place to another, were able to reach the shooter's location and apprehend him more quickly than Facebook was able to take down a video. Police were on site in about 7 minutes, and had arrested him within 20 minutes of his first shot (about 14:01), while Facebook still didn't even know what was happening. Last time I checked, electronic signals travel much faster than police cars.