
Who Is Responsible for Stopping Live-Streamed Crimes?


The horrific live-streamed rape of a 15-year-old Chicago girl on Facebook this week has once again raised questions about what responsibility social media companies and their users bear to prevent or report crimes carried out online.

At least 40 people watched the sexual assault of a 15-year-old girl by five or six men on Facebook Live, authorities said Monday. None of the spectators called police, who found out about the attack only after the girl’s mother showed the city’s police superintendent screenshots of the video.

Chicago Police immediately contacted Facebook, which took down the disturbing video.

But should the social media giant face any criminal repercussions for hosting what amounts to child pornography?

Related: Gang Rape of Chicago Teen Was Watched Live by 40 People on Facebook, No One Called Cops

“Crimes like this are hideous and we do not allow that kind of content on Facebook. We take our responsibility to keep people safe on Facebook very seriously and will remove videos that depict sexual assault and are shared to glorify violence,” a Facebook spokeswoman said Tuesday.

In January, four Chicago-area young men and women were arrested after a horrifying Facebook Live video showed them beating and torturing a bound and gagged mentally disabled man.

And last year, an Ohio woman was sentenced to nine months in prison for broadcasting the rape of a 17-year-old on the live-streaming app Periscope.

But websites generally aren’t liable for third-party content, thanks to Section 230 of the Communications Decency Act.

The intent behind the law is to encourage free speech and uninhibited expression, noted Eric Goldman, director of the High Tech Law Institute at Santa Clara University School of Law.

“It’s been the law for 20 years, and it’s the foundation of most of the sites that we love and enjoy today,” said Goldman. “If you think about what are the sites you use the most, Google or Facebook or Twitter or Craigslist, eBay or Snapchat — chances are, they actually rely upon Section 230.”

Yet, Goldman noted: “If Facebook had knowledge that there was a live stream of a sexual violation of a child, Facebook would almost certainly be obligated to take more action than merely terminating the stream. This is based on a law that Congress enacted back in 1998 that basically requires service providers to pick up the phone if they see any evidence of child pornography. At that point, the video is a form of child pornography, yes, and at that point then the service provider would be obligated to do more than simply delete the content.”

Image: A woman poses for a photo using her smartphone in Rio de Janeiro on Aug. 21, 2013. Silvia Izquierdo / AP, file

But millions of hours of video are uploaded to these platforms and monitoring each and every video is an “onerous” and “very difficult” task, he said.

“I can’t imagine Facebook knowing about [illegal content] and not taking it down,” said Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society. Platforms most likely aren’t aware of these videos unless someone flags them, she said.

Most sites do have automated technology tasked with moderating content, but it doesn’t always get it right and can miss things, she said.

Live streaming is even trickier, presenting a whole different set of challenges, she said. When things happen in real time, no one knows what will happen next, and it’s extremely difficult for automated technology to monitor live events, she said.

Related: Four Teens Held in Connection With Kidnapping After ‘Facebook Live’ Torture Video Airs

Which is why service providers like Facebook count on users to be their eyes and ears.

On its site, the company says “we rely on people like you. If you see something on Facebook that you believe violates our terms, please report it to us.”

In the case of Facebook Live, “a reviewer can interrupt a live stream if there is a violation of our Community Standards. Anyone can report content to us if they think it goes against our standards, and it only takes one report for something to be reviewed.”

As a practical matter, even though it is not required to by law, Facebook does take down problematic videos and will work with law enforcement, Keller said.

And that cooperation is something law enforcement generally appreciates: “These companies have always cooperated fully with investigations,” said a Chicago Police Department spokesman.

Social media companies can do better by clearly highlighting “report abuse” buttons and by training their employees on clear protocols for when crimes are flagged, said Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology.

But that system only works if users actually use those functions and flag content, she said.

“Users are a big part of the content moderation process,” she said.

While in most cases users are quick to hit the flag button, there is also a degree of “bystander syndrome,” Goldman said. “Many times a user thinks since so many others are also watching, someone else must have taken care of it, or they simply don’t want to get involved,” he said.

But the responsibility falls on all sides, Llansó said.

