Facebook will hire 3,000 people around the world to monitor videos and posts for violent or criminal acts, and potentially prevent tragedies from occurring.
The social-media site has faced calls to do more, and to respond faster, after a murder and a suicide were recently broadcast live. The new employees, to be added over the next year, will join the 4,500 people already on Facebook’s content-moderation team.
The problem will eventually be solved once computers can reliably determine the content and context of video, but for now a human touch is needed, Chief Executive Officer Mark Zuckerberg wrote in a Facebook post Wednesday.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
Video, both live and in posted clips, is crucial to Facebook’s future as it looks to video ads to make up for an expected slowdown in revenue growth. But Facebook has had to grapple with the dark side of video as users widely shared graphic footage on its network in recent months, including a spate of live-streamed suicides and rapes and the real-time confession of a man who had posted a video of himself gunning down a Cleveland man.
Facebook declined to comment on where the workers will be stationed. The company doesn’t say how many Facebook Live videos are posted each day but confirmed that 1 in 5 videos on the site is a live broadcast.
The new reviewers “will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg said. Facebook will keep working with community organizations, such as suicide-prevention groups, and with law enforcement to offer assistance to people who post videos, or appear in them, and may need help, he said.