Haugen: Facebook should not be allowed to "mislead" its Oversight Board

Internal Facebook documents revealed

By Clare Duffy, Aditi Sangal, Melissa Mahtani and Meg Wagner, CNN

Updated 5:55 p.m. ET, October 26, 2021
10:37 a.m. ET, October 25, 2021

Haugen: Facebook should not be allowed to "mislead" its Oversight Board

From CNN's Charles Riley

Facebook should not be allowed to mislead its Oversight Board, whistleblower Frances Haugen told UK lawmakers on Monday, saying the body should demand more transparency from the social media giant.

"This is a defining moment for the Oversight Board. What relationship does it want to have with Facebook? I hope the Oversight Board takes this moment to stand up and demand a relationship that has more transparency," said Haugen.

"If Facebook can come in there and just actively mislead the Oversight Board — which is what they did — I don’t know what the purpose of the oversight board is," she added.

The Facebook Oversight Board is made up of experts in areas such as freedom of expression and human rights. They were appointed by the company but operate independently. The board is often described as a kind of Supreme Court for Facebook.

Last week, the board said that Facebook failed to provide crucial details about its "Cross-Check" program that reportedly shielded millions of VIP users from normal content moderation rules.

Facebook uses Cross-Check to review content decisions relating to high-profile users, such as politicians, celebrities and journalists. The program had mushroomed to include 5.8 million users in 2020, according to the Wall Street Journal.

On Sunday, Oversight Board member and PEN America CEO Suzanne Nossel said that Facebook must do a better job with transparency.

The board adjudicates cases on controversial content that is either left up or taken down — but these cases are just "the tip of the iceberg" when it comes to oversight at Facebook, Nossel said on CNN's "Reliable Sources."

"They didn't want to bear the full weight of responsibility for big questions like, 'Should Donald Trump be allowed on the platform,'" Nossel said. "I think they're very ambivalent about regulation."

10:19 a.m. ET, October 25, 2021

Facebook's success was built on algorithms. Can they also fix it?

From CNN's Rachel Metz

Algorithms help billions of people around the world see relevant content across the internet. That's not unique to Facebook.

However, Facebook whistleblower Frances Haugen's testimony earlier this month and submitted documents have renewed scrutiny of the impact Facebook and its algorithms have on teens, democracy and society at large.

The fallout has raised the question of just how much Facebook — and perhaps platforms like it — can or should rethink using a bevy of algorithms to determine which pictures, videos and news users see.

Algorithms are not going away. But there are ways for Facebook to improve them, experts in algorithms and artificial intelligence told CNN. It will, however, require something Facebook has so far appeared reluctant to offer (despite executive talking points): more transparency and control for users.

Rethinking focus on engagement

A big hurdle to making meaningful improvements is social networks' current focus on the importance of engagement, or the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads, experts say.

Haugen revealed internal documents from Facebook that show the social network is aware that its "core product mechanics, such as virality, recommendations and optimizing for engagement, are a significant part" of why hate speech and misinformation "flourish" on its platform.

"Engagement is not a synonym for good mental health," said James Mickens, a computer science professor at Harvard and co-director of the Berkman Klein Center's Institute for Rebooting Social Media.

Changing this is tricky, though several experts agreed it may involve considering how users feel when using social media, not just how much time they spend on it.

In the past, some might have said it would require pressure from advertisers whose dollars support these platforms. But in her testimony, Haugen seemed to bet on a different answer: pressure from Congress.

10:27 a.m. ET, October 25, 2021

Whistleblower: Facebook views safety as a cost center instead of growth center

From CNN's Aditi Sangal

Former Facebook employee and whistleblower Frances Haugen told the UK parliament that the company views safety as a cost center and not an investment for growth.

"I think there is a view inside the company that safety is a cost center; it's not a growth center, which, I think, is very short-term in thinking. Because Facebook's own research has shown that when people have worse integrity experiences on the site, they are less likely to retain," she said Monday.

She urged British lawmakers to put regulations in place, saying it was for the good of the company's long-term growth.

"I think regulation could actually be good for Facebook's long-term success. Because it would force Facebook back into a place where it was more pleasant to be on Facebook," she said.

Remember: When Haugen testified in the US Congress, she urged the lawmakers to step in and create regulations, too.

10:14 a.m. ET, October 25, 2021

Haugen says Facebook has a "huge weak spot" when it comes to reporting issues up the chain

From CNN's Charles Riley

Former Facebook employee and whistleblower Frances Haugen told UK lawmakers that the social media giant doesn't devote enough resources to critical areas, such as national security and public safety.

Facebook has a "huge weak spot" when it comes to reporting issues up its chain of command, Haugen said in testimony on Monday.

"If I drove a bus in the United States, there would be a phone number in my break room that I could call that would say ‘Did you see something that endangered public safety? Call this number,'" she said.

"When I worked on counter-espionage [at Facebook] I saw things where I was concerned about national security, and I had no idea how to escalate those, because I didn’t have faith in my chain of command at that point," added Haugen.

The whistleblower claimed that asking for more resources to tackle difficult issues was not part of Facebook's corporate culture.

"We were told just to accept under-resourcing," she told UK lawmakers. "There is a culture that lionizes kind of a startup ethic that, in my opinion, is irresponsible."

"I flagged repeatedly when I worked on Civic Integrity that I felt that critical teams were understaffed, and I was told at Facebook ‘we accomplish unimaginable things with far fewer resources than anyone would think possible,'" said Haugen.

10:03 a.m. ET, October 25, 2021

Whistleblower: Facebook's content safety systems don't apply equally in non-English-speaking countries

From CNN's Aditi Sangal

While testifying in the UK parliament, former Facebook employee and whistleblower Frances Haugen said one of her primary concerns is Facebook's "under-investment in non-English languages and how they mislead the public [into thinking that] they are supporting them."

"Facebook says things like, 'we support 50 languages,' when in reality, most of those languages get a tiny fraction of the safety systems that English gets," Haugen told British lawmakers.

For example, in the recent revelations from the Facebook Papers, it is clear that Facebook employees repeatedly sounded the alarm on the company's failure to curb the spread of posts inciting violence in "at risk" countries like Ethiopia, where a civil war has raged for the past year. But the documents reveal that Facebook's moderation efforts were no match for the flood of inflammatory content on its platform.

By the way, by Facebook's own estimates, it has 1.84 billion daily active users — 72% of whom are outside North America and Europe, according to its annual SEC filing for 2020.

The documents also indicate that the company has, in many cases, failed to adequately scale up staff or add local language resources to protect people in these places.

Even Facebook's most advanced moderation efforts may be best suited to American English, Haugen said.

"UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually [underenforced] in the UK," Haugen explained.

9:50 a.m. ET, October 25, 2021

Facebook staff this weekend were told to brace for "more bad headlines"

From CNN’s Donie O’Sullivan 

Facebook's Vice President of Global Affairs Nick Clegg in Berlin in June 2019.

Facebook Vice President of Global Affairs Nick Clegg told staff at the company to be prepared for more “bad headlines in the coming days” as a consortium of news organizations, including CNN, continue to publish stories based on a cache of tens of thousands of pages of leaked documents from the company. 

Clegg made the comments in an internal company post on Saturday, Axios first reported. A copy of the memo was obtained by CNN Sunday. 

Clegg took aim at news organizations, writing:

“Social media turns traditional top-down control of information on its head. In the past, public discourse was largely curated by established gatekeepers in the media who decided what people could read, see and digest. Social media has enabled people to decide for themselves – posting and sharing content directly. This is both empowering for individuals – and disruptive to those who hanker after the top-down controls of the past, especially if they are finding the transition to the online world a struggle for their own businesses.”

He also echoed public statements made by the company, writing, “At the heart of these stories is a premise which is plainly false: that we fail to put people who use our service first, and that we conduct research which we then systematically ignore. Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands what we're about, and where our own commercial interests lie."

9:54 a.m. ET, October 25, 2021

Facebook revelations are shocking — but nothing will change until Congress acts

From CNN's Allison Morrow

Public pressure alone won't get Facebook to change. If shame were enough, Facebook would have changed after the 2016 election. Or the Cambridge Analytica scandal. Or the 2020 election.

Even when dozens of major brands pulled their advertising over Facebook's lax approach to regulating hate speech, the company barely felt a ding.

So it's up to Washington to fix Facebook. And that's no easy task.

Part of the problem with regulating Facebook is that lawmakers and regulators are feeling around in the dark for a solution to a problem society has never faced before. To borrow whistleblower Frances Haugen's metaphor, it's like the Transportation Department writing the rules of the road without even knowing that seat belts are an option.

And Facebook's structure is uniquely murky, even among tech companies, according to Haugen.

"At other large tech companies like Google, any independent researcher can download from the Internet the company's search results and write papers about what they find," she said. "But Facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system."

Read more here.

9:04 a.m. ET, October 25, 2021

The Facebook whistleblower is testifying in the UK today

Former Facebook employee and whistleblower Frances Haugen testified before a Senate Committee on Commerce, Science, and Transportation hearing on Capitol Hill on October 5, 2021, in Washington, DC.

Former Facebook employee-turned-whistleblower Frances Haugen, who testified before Congress about how the social media giant misled the public, will now face questions in the UK Parliament.

Haugen is set to testify starting at 9:30 a.m. ET.

Haugen, the 37-year-old former Facebook (FB) product manager who worked on civic integrity issues at the company, revealed her identity during a "60 Minutes" segment that aired earlier this month.

She has reportedly filed at least eight whistleblower complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public. She also shared the documents with regulators and the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps.

In her testimony before Congress earlier this month, Haugen faced questions from a Commerce subcommittee about what Facebook-owned Instagram knew about its effects on young users, among other issues.

"I am here today because I believe that Facebook's products harm children, stoke division, and weaken our democracy," she said during her opening remarks. "The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won't solve this crisis without your help."

8:46 a.m. ET, October 25, 2021

Facebook confronts an existential crisis

From CNN's Clare Duffy

Facebook has confronted whistleblowers, PR firestorms and Congressional inquiries in recent years. But now it faces a combination of all three at once in what could be the most intense and wide-ranging crisis in the company's 17-year history. 

On Friday, a consortium of 17 US news organizations began publishing a series of stories — collectively called "The Facebook Papers" — based on a trove of hundreds of internal company documents which were included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel. The consortium, which includes CNN, reviewed the redacted versions received by Congress.  

CNN's coverage includes stories about how coordinated groups on Facebook sow discord and violence, including on January 6, as well as Facebook's challenges moderating content in some non-English-speaking countries, and how human traffickers have used its platforms to exploit people. The Wall Street Journal previously published a series of stories based on tens of thousands of pages of internal Facebook documents leaked by Haugen. The consortium's work is based on many of the same documents.

Facebook has dealt with scandals over its approach to data privacy, content moderation and competitors before. But the vast trove of documents, and the many stories surely still to come from it, touch on concerns and problems across seemingly every part of its business: its approach to combatting hate speech and misinformation, managing international growth, protecting younger users on its platform and even its ability to accurately measure the size of its massive audience.  

All of this raises an uncomfortable question for the company: Is Facebook actually capable of managing the potential for real-world harms from its staggeringly large platforms, or has the social media giant become too big not to fail?

Ongoing problems

The documents show various examples of issues that Facebook has been aware of, even as it still struggles with them. Take the example of a report published by the Journal on September 16 that highlighted internal Facebook research about a violent Mexican drug cartel, known as Cartél Jalisco Nueva Generación. The cartel was said to be using the platform to post violent content and recruit new members using the acronym "CJNG," even though it had been designated internally as one of the "Dangerous Individuals and Organizations" whose content should be removed. Facebook told the Journal at the time that it was investing in artificial intelligence to bolster its enforcement against such groups. 

Despite the Journal's report last month, CNN last week identified disturbing content linked to the group on Instagram, including photos of guns, and photo and video posts in which people appear to have been shot or beheaded. After CNN asked Facebook about the posts, a spokesperson confirmed that multiple videos CNN flagged were removed for violating the company's policies, and at least one post had a warning added. 

Facebook's response

Facebook, for its part, has repeatedly tried to discredit Haugen, and said her testimony and reports on the documents mischaracterize its actions and efforts.  

"At the heart of these stories is a premise which is false," a Facebook spokesperson said in a statement to CNN. "Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie." 

In a tweet thread last week, the company's Vice President of Communications, John Pinette, called the Facebook Papers a "curated selection out of millions of documents at Facebook" which "can in no way be used to draw fair conclusions about us." But even that response is telling — if Facebook has more documents that would tell a fuller story, why not release them?

Read more here.