Should Facebook Add An 'Emergency' Function To Report Live-Streaming Abuses? A County Commissioner Leads The Charge
By Stephen Gossett in News on Apr 19, 2017 4:57PM
Getty Images / Photo: Sean Gallup
In the wake of perhaps the most gruesome instance yet of a crime being broadcast on Facebook Live, a Cook County Commissioner is calling on the social-media giant to add an "emergency" reporting function and keep outside parties from extracting clips of questionable content.
Commissioner Richard Boykin on Monday sent a letter addressed to Facebook founder Mark Zuckerberg requesting the changes. Boykin drafted the request following the grisly, high-profile murder of Robert Godwin Sr., 74, by gunman Steve Stephens in Cleveland on Sunday, which the shooter recorded and posted on Facebook.
Here in Chicago, several crimes have been broadcast on Facebook Live in the recent past—some that simply happened to occur while a victim was streaming, others in which broadcasting the incident was intentional. A 15-year-old girl was sexually assaulted last month in an attack that Supt. Eddie Johnson said was viewed by at least 40 people, none of whom contacted police. Four people face hate crime charges in the notorious Facebook Live torture case from January, in which suspects streamed video of an assault of a bound schizophrenic victim. A 2-year-old boy and his uncle were shot and killed on Valentine's Day while the man's pregnant girlfriend, who was also shot, was broadcasting.
"Facebook has an obligation and a responsibility to be a good corporate citizen by not allowing tragedies to be live streamed," Boykin wrote in his letter to Facebook.
He called on the company to implement the following two changes:
"Facebook should create an “Emergency” button that is associated with its live streaming product that will allow users to alert Facebook administrators of potentially life-threatening activity being streamed on the platform. This button could signal Facebook’s algorithm to halt the streamers video feed.Facebook should implement technology to prevent and block third party software from extracting videos containing criminal activity, unless these videos are being utilized to inform the perspectives of law enforcement officials."
Boykin told Chicagoist that Facebook acknowledged receiving the letter and said it would get back to him with an appropriate response, but the company did not offer a timeline.
"We're going to keep the drumbeat on them, they're going to have to so something... Social media is helping to drive the violence in Chicago," Boykin told Chicagoist.
"These videos should not stay on their platform for two to three hours," he added.
Boykin also mentioned last year's resolution that chartered a social-media gang task force, noting that social media "has played a major role in driving threats and retaliation."
Facebook does currently have monitoring capabilities. Users can flag objectionable content, but the process of removal can be long and flawed.
"Because Facebook’s network of users is so vast, the company relies on a combination of artificial intelligence, human moderators and alerts from users to flag objectionable content. If many people report an instance of offensive or harmful content at once, Facebook’s algorithms will show the post to a global team of human content moderators, who will review it and decide if it violates Facebook’s terms of service."
At the same time, Facebook remains perpetually under the microscope for having flagged and/or removed content that serves a social or artistic function—like the acclaimed Chicago photographer who saw her profile deleted "in error" last year. Add the giant ball of wax that is "fake news" and the overall issue appears trickier still.
But at least when it comes to Facebook Live video, Boykin argues that the company's braintrust should be able to engineer a fix that doesn't unduly censor users. "They're a private company," he said. "They don't have the same rules government has. Nobody is allowed to yell 'fire' in a crowded theater. They can police content on their platform."
"They have some of the smartest people working for them" who can develop such a solution, he said.
Boykin said Facebook has a "moral obligation" to address the issue and that his office plans to keep the pressure on. Facebook did not immediately return a request for comment. This post will be updated as necessary.