MENLO PARK, Calif. — Facebook has launched a civil rights task force and an election monitoring center to guard against interference in the 2020 presidential campaign and census, the tech giant said Sunday.
The Facebook task force, chaired by COO Sheryl Sandberg, follows pressure from civil rights groups and minorities who say the company has not done enough to combat anti-democratic tactics such as voter intimidation and suppression. The U.S. election team will be established by the end of the year.
The moves underscore worries of a new wave of political interference as the United States enters a pivotal campaign season. They also reflect a growing push by Facebook to build decision-making structures inside and outside the company that can show it is capable of responsibly handling disinformation and hate speech and safeguarding user data.
Sunday’s announcement came as a leading civil rights expert released her second interim report in a multi-year audit of the social media company. The report, a copy of which was reviewed by CNN, said Facebook has improved in the way it incorporates civil rights concerns into its products, but raised questions about the long-term durability of those efforts. “As the largest social media company in the world, what Facebook has committed to here is a consequential and important start,” wrote report author Laura Murphy, a former legislative director of the American Civil Liberties Union. “But only if it continues to build upon what it is announcing today.”
Facebook promises to roll out a range of additional policies in the coming months reflecting the report’s recommendations. Advertisements trying to persuade users not to vote will soon be forbidden, for example. The company will restrict how housing, credit and job advertisers may target users so as to prevent discrimination by age and gender, which has repeatedly been a problem on the platform despite Facebook’s promises to address it. And it will launch a new policy in the fall banning any content aimed at misleading users about the upcoming census.
As with Facebook’s struggle to catch disinformation more broadly, artificial intelligence will figure prominently in that effort, Sandberg said in a blog post. “We’re going to treat next year’s Census like an election,” she wrote, “with people, policies and technology in place to protect against Census interference.”
The company also vowed on Sunday to tighten its policies surrounding offensive material and content moderation. Facebook will “very soon” consider closing a loophole that currently permits hate speech to remain on the platform if it claims to be humor, said Neil Potts, Facebook’s director of public policy.
Meanwhile, the company is testing changes to the content moderation tools used by Facebook’s human reviewers. The updates, Facebook said, could make it less likely that critics of hate speech are wrongly penalized for citing examples of it in their own posts.
Moderators themselves may see a reprieve under Facebook’s new policies. In recent months, numerous reports have highlighted the toll that repeatedly viewing violent, hateful or graphic words and images has taken on the moderators, who are often contractors rather than Facebook employees. Facebook will be seeking to amend its vendor contracts to ensure those workers actually access the mental health resources that are provided to them, spokeswoman Ruchika Budhraja told CNN.
Civil rights activists said Sandberg’s recent personal involvement with the audit helped raise the profile of their complaints. “This is a much better step forward than the last release in December,” said Rashad Robinson, president of the advocacy organization Color of Change, who was briefed on the report and Facebook’s new policies. “There is more intention. There is quicker turnaround.”
But, Robinson said, outside groups will be watching closely to see if Facebook’s promises produce results. “That task force needs to have real teeth,” he said. “It needs to be substantive. It needs to recognize that the attempts to test these changes will come from those who have used the platform to sow hate and division.”
Under Facebook’s plan, the civil rights task force will meet monthly and serve as a conduit for employees and outside groups to raise concerns. The group includes officials from internal departments including product development, advertising and diversity. And Facebook will bring on outside civil rights experts to consult on the company’s work, the report said.
The report comes days after company CEO Mark Zuckerberg discussed a proposed external oversight board with the power to review Facebook’s content moderation decisions. Speaking at the Aspen Ideas Festival on Wednesday, Zuckerberg said the independent body could enhance Facebook’s accountability by creating “some separation of powers.”
“We’re starting this as a project just for Facebook, but over time I could see expanding that so that more of the industry joins,” he said.
The proposal underscores how Facebook, with its more than 2.3 billion users, often resembles a country — but one with only a nascent governance structure.
Facebook is seeking to build that structure up rapidly, as regulators around the world increasingly look to investigate or penalize the company for alleged violations of privacy and competition laws. Some politicians have called for Facebook to be broken into pieces. On Wednesday, Zuckerberg pushed back on that idea, arguing that splitting Facebook apart would simply make it more difficult to meet the challenge posed by hate speech and disinformation.