Mark Zuckerberg, the CEO of Facebook, said that this would be in addition to the 4,500 people already working in this capacity. What is not clear is whether these are full-time employees or contractors, and how the screeners will, in effect, be screened, according to TechCrunch.
It’s a big hiring move, but is it enough? The company currently has close to 2 billion users — we’ll be getting an update on that number later today when Facebook posts its quarterly earnings (interesting timing to release this just ahead of those) — and Zuckerberg said that there are “millions of reports” received every week.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote in a post earlier today. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
The move to add more human curation into the mix is nevertheless a step in the right direction. To date, the company has put more emphasis on building algorithms and mechanisms for people to report on friends, or even themselves, if they are concerned. In March, it launched a new set of suicide prevention tools. A month later, it introduced new tech to combat revenge porn.
While the reviewers’ role will be to ease the bottleneck between content being reported and that content being taken down, Facebook said it will continue to work with authorities and also continue to invest in — yes — more technology.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer,” Zuckerberg wrote.
As the world’s biggest social network, Facebook has held a contentious place in the ongoing debate about what role social media is playing in how information is spread around the world today.
During recent political events like the U.S. elections and the referendum in the U.K., many accused Facebook of helping to disseminate misleading spin, or outright false information, about the candidates and issues at hand, connecting that dissemination directly to the outcomes of those votes.
Facebook has on the one hand claimed to have had a statistically small influence, while at the same time recently pledging to do more to combat the spread of false information on its platform. That effort includes both more human curation and new tools for people to flag and blunt the reach of posts intentionally created to be incendiary and viral, but perhaps not ultimately to enlighten anyone at all.