Facebook's Menlo Park, California headquarters in 2017.
Facebook's software systems are getting ever better at detecting and blocking hate speech on both the Facebook and Instagram platforms, the company said today. But the hardest work is still done by people, and many of those people warn that the world's largest social media company is putting them in unsafe working conditions.
About 95 percent of hate speech on Facebook is caught by algorithms before anyone reports it, Facebook said in its latest community standards enforcement report. The remaining 5 percent of the roughly 22 million flagged posts in the past quarter were reported by users.
The report also tracks a new metric for hate speech: prevalence. To measure prevalence, Facebook takes a sample of content and then measures how often the thing being measured, in this case hate speech, is viewed as a percentage of all content views. Between July and September of this year, the figure was between 0.10 and 0.11 percent, or about 10 to 11 views per 10,000.
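To make the arithmetic concrete, here is a minimal sketch in Python of how a prevalence figure like that could be estimated from a sample of view records. The function and variable names are illustrative assumptions for this article, not Facebook's actual measurement pipeline:

```python
def estimate_prevalence(sampled_views, violating_ids):
    """Estimate prevalence: the share of sampled views that landed on
    violating content, expressed as a fraction of all sampled views."""
    if not sampled_views:
        return 0.0
    hits = sum(1 for content_id in sampled_views if content_id in violating_ids)
    return hits / len(sampled_views)

# Toy example: 11 violating views out of a 10,000-view sample reproduces
# the 0.11 percent upper end of the range Facebook reported.
views = [f"post-{i}" for i in range(10_000)]       # one entry per sampled view
violating = {f"post-{i}" for i in range(11)}       # content labeled as hate speech
print(f"{estimate_prevalence(views, violating):.2%}")  # -> 0.11%
```

The key design point of a view-weighted metric like this is that a single violating post seen a million times counts far more than a thousand violating posts that almost no one saw.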
Facebook also stressed, both in its press release and in a call with reporters, that while its in-house AI is improving across several categories of content enforcement, COVID-19 continues to affect its ability to moderate content.
"As the COVID-19 pandemic continues to disrupt our content review workforce, we are seeing some enforcement metrics returning to pre-pandemic levels," the company said. "Even with a reduced review capacity, we still prioritize the most sensitive content that needs review, including areas like suicide, self-harm, and child nudity."
Those reviewers are critical, Facebook vice president of integrity Guy Rosen told reporters on the call. "People are an important part of the equation for content enforcement," he said. "These are incredibly important workers who do an incredibly important part of the job."
Facebook's full-time employees, those directly employed by the company, are expected to work from home until July 2021, or perhaps permanently.
Speaking to reporters, Rosen stressed that Facebook workers who do have to come in to a physical workplace, such as those who manage essential functions in data centers, are given strict safety precautions and personal protective equipment, including hand sanitizer.
Moderation, Rosen said, is one of those jobs that can't always be done from home. Some content is simply too sensitive to review outside a dedicated work area, where family members might otherwise see it, he explained, adding that some Facebook content moderators are being brought back into offices "to ensure we can have that balance of people and AI working" on the areas "that require human judgment."
The majority of Facebook's content moderators, however, do not work for Facebook. They work for contracting firms around the world, often with woefully inadequate support to do their jobs. Reporters from The Guardian, The Verge, the Washington Post, and BuzzFeed News, among others, have spoken with these contract workers, who describe relentless expectations and widespread trauma at work. Earlier this year, Facebook agreed to a $52 million settlement in a class-action suit brought by former content moderators who alleged the job gave them "debilitating" post-traumatic stress disorder.
All of that was before COVID-19 spread around the world; amid the pandemic, the situation looks even worse. More than 200 moderators who are being told to return to the office signed an open letter accusing Facebook of "needlessly risking moderators' lives" without even offering hazard pay to workers ordered back to the office.
"In addition to the psychologically toxic work, sticking to the job means going into a hot zone," the letter said. "Multiple COVID cases have occurred in multiple offices. Employees have asked the Facebook leadership and the leadership of their outsourcing companies like Accenture and CPL to take urgent action to protect us and value our work. They declined. We are publishing this letter because we have no other choice. "
"This raises a stark question," the letter adds. "If our work is so central to Facebook's business that you're asking us to risk our lives – and make a profit – on behalf of the Facebook community – aren't we actually the heart of your business?"
Meanwhile, state and federal scrutiny of Facebook keeps growing. This week, CEO Mark Zuckerberg testified before the Senate for the second time in just three weeks. Members of the House have also complained that Facebook has failed to moderate content properly or safely in the face of rampant disinformation surrounding the election.
Other regulators are also likely to come for Facebook, and soon. According to media reports, many of the antitrust probes that began in 2019 are drawing to a close. The Federal Trade Commission reportedly plans to file a suit within the next two weeks, and a coalition of nearly 40 states, led by New York Attorney General Letitia James, is expected to follow in December. These suits are likely to argue that Facebook unfairly stifles competition through its acquisition and data strategies, and they may seek to force the company to divest Instagram and WhatsApp.