
Facebook's 'unpleasant underbelly' policed by thousands of content reviewers worldwide

Content reviewing is one of the fastest-growing, entry-level job sectors in Silicon Valley as social media companies fight to rid their platforms of ever-growing amounts of toxic content.

Facebook plans to expand team working on safety and security to 20,000 this year

Sarah Katz is a 27-year-old cyber security expert who worked with Facebook in Palo Alto, Calif., in 2016. She says some of the content she reviewed for the platform was 'disturbing.' (Sylvia Thomson/CBC)

Sarah Katz's eyes dart around the Palo Alto, Calif., coffee shop.

"Am I OK to speak here?" she said, not wanting to offend anyone within earshot with what she was about to describe. "I don't want to, like, bother people."

Katz is a 27-year-old self-described former "spam analyst" who worked on contract with Facebook in 2016. She spent her days scanning flagged content, deciding whether posts met Facebook's standards and should be kept as is on the platform or were so disturbing that they should be deleted.

"Primarily pornography, sometimes bestiality, child pornography," she said, as she described the worst of the up to 8,000 posts she scanned every day.

Some stuck with her.

"There was a girl around 12 and a little boy, like nine, and they were standing facing each other and they didn't have pants on. And there was someone off-camera who spoke another language," she said.

"So he's probably just telling them what to do. So that was disturbing."

Katz was part of one of the fastest-growing, entry-level job sectors in Silicon Valley, that of content reviewer. Twitter, YouTube and Facebook are all fighting to rid their sites of ever-growing amounts of toxic content.

James Mitchell, director of risk and response at Facebook, says the company faces huge challenges combating the growing types of abusive content on the platform. (Jason Burles/CBC)

Facebook began as a site for university students, but has grown into the largest social media platform in the world. With that growth come huge challenges, said James Mitchell, director of risk and response at Facebook headquarters in Menlo Park, Calif.

"One of the big changes we saw was how the content became substantially more global in nature, and we began seeing substantially more types of abuse on the platform and substantially greater volumes.And we really had to grow and scale our teams to be able to combat that," he said.

"The world is changing around you, and the way people are using the product is changing," he added.

"So that means you always have this evolving process of trying to figure out the best ways to keep the platform safe."

A building is seen inside the Palo Alto, Calif., headquarters of Facebook. (Sylvia Thomson/CBC)

Consider this gamut of troubling content:

  • A United Nations report found Facebook "substantively contributed to the level of acrimony and dissension and conflict" during the Rohingya crisis in Myanmar.
  • The immediate aftermath of Philando Castile's shooting by a Minnesota police officer was broadcast on Facebook Live by his girlfriend.
  • Student survivors of the Parkland shooting, such as David Hogg, were portrayed as "crisis actors" in fake posts.
  • Alek Minassian, the suspect in the Toronto van attack in April that killed 10 pedestrians and injured 16, allegedly posted about an "Incel Rebellion" before the incident. Facebook later shut down his account.
  • The Russian propaganda group Internet Research Agency was accused of using trolls on the platform to influence the U.S. election.

While artificial intelligence can tackle a lot of the posts that are created by fake accounts, humans are still key to making tricky ethical decisions.

Facebook had 4,500 people on the job last year; 7,500 work on it now. The company plans to increase the team responsible for safety and security to 20,000 this year -- many of whom will be content reviewers.

Much of the work is contracted out to third-party partners, which are staffing up in places such as India and the Philippines.

Facebook reviewers work around the world and in various languages. The idea is to have people who are aware of local cultural differences and norms; the Asia-Pacific area is the largest region for new Facebook users.

A new documentary, called The Cleaners, shows the toll the work takes on the reviewers in a third-party company in Manila. One reviewer said he had watched "hundreds of beheadings." Another said she'd go home thinking about pornography after seeing it so much at work.

A sign with a 'like' symbol is seen outside of Facebook headquarters in Menlo Park, Calif. (Sylvia Thomson/CBC)

It's unclear what kind of support these outsourced workers get, though Facebook said all employees who are reviewing content get "wellness breaks," training videos and psychological help.

"We try to ensure that everybody gets and has resources for psychological counselling," said Mitchell. "We think about the wellness of people that are working on these issues.

"The reality is they know there is value that they're adding for people on the site. They know they are preventing bad actions from happening to people. If one of the things you do is review live videos for suicide and self harm, you actually have the ability to potentially save a life."

But Mitchell wouldn't give details about how many staffers doing the work are hired by third-party partners. Nor would he talk about how many are based where.

"They're hiding the debate," said The Cleaners filmmaker Moritz Riesewieckat a recent Toronto screening.

"They're hiding the dilemma they are facing in building these platforms, and not being responsible for what goes on these platforms."

UCLA Assistant Professor Sarah Roberts questioned how Facebook can 'reasonably adjudicate' a platform with billions of users, even with 20,000 content reviewers. (Anand Ram/CBC)

Sarah Roberts, a UCLA assistant professor who is writing a book on the topic, said this is the "unpleasant underbelly" of the social media platform.

"I mean, we are talking about billions of posts per day when it comes to Facebook. We're talking about 400 hours of video content per minute, 24/7," she said.

"So this amount is vast. But really, even 20,000 workersI mean, how can theyreasonably adjudicate a platform of billions of users?"

In the first quarter of 2018, Facebook pulled down 21 million pieces of adult nudity or pornography and 3.5 million incidents of graphic violence, the majority of which was flagged by artificial intelligence.

For hate speech, technology doesn't quite do the trick: 2.5 million pieces of hate speech were pulled down in the same period, mostly by human reviewers.

"From the perspective of content reviewers, we have always played that policeman role, and so the dynamic nature of content that's being shared on our platform will continue to create challenges for us," said Mitchell.

"The other big wildcard is just the way the world continues to evolve. So much of what we do is dependent on what people are sharing, and that's changing every few months."


Watch Susan Ormiston's story from The National about Facebook's efforts to moderate what people post:

Inside Facebook's effort to purge troubling content

Duration 13:40
With the company facing increasing scrutiny over the spread of fake news and hate speech, Facebook says it is devoting more resources to deal with banned content on its site.

With files from Susan Ormiston and Simi Bassi