Facebook whistleblower: Five revelations about the social media giant made by former employee

Mark Zuckerberg is under pressure like never before as Congress considers whether and how it can regulate platforms like Facebook, ITV News Correspondent Robert Moore reports


Facebook's products "harm children, stoke division and weaken our democracy", a former employee turned whistleblower has claimed in a series of bombshell revelations.

Frances Haugen, a former product manager for the tech giant, appeared before Congress on Tuesday to share information on her former employer.

The first set of allegations came just 24 hours before Facebook, WhatsApp and Instagram went down worldwide for several hours on Monday, leaving more than three billion users without access to the services.

Ms Haugen revealed her identity in a CBS "60 Minutes" interview before appearing before the US Congress.

Explaining her motivation for doing so, the 37-year-old told Congress: "I came forward at great personal risk because I believe we still have time to act. But we must act now."

The ex-employee came forward with tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.

So what are the key claims she has made against Facebook?

Internal research shows Instagram is harmful to teenagers

Ms Haugen accused the company of being aware that Instagram causes apparent harm to some teens and of knowing it can be "toxic" for young girls.

She claimed leaked internal research demonstrates this, but that, publicly, the company downplayed the negative impacts of the photo-sharing platform.

Research showed peer pressure caused by the visually-focused platform led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts.

“Facebook knows that they are leading young users to anorexia content," Ms Haugen said.

One internal study found 13.5% of teenage girls said Instagram makes thoughts of suicide worse and 17% said it makes eating disorders worse.

Ms Haugen likened Facebook and Instagram to Big Tobacco because the platforms give teenagers and young people "little dopamine hits" when they receive a "like" for a post.

"It’s just like cigarettes," she said. "Teenagers don’t have any self-regulation. We need to protect the kids.”



Algorithm changes caused more division

Ms Haugen said a 2018 change to the content flow on users' news feeds led to more division and hate on a network that was supposed to bring people closer together.

Despite the enmity that the new "dangerous" algorithms were fuelling, she said Facebook found they helped keep people coming back to its platforms.

The data expert, who focused on algorithm products during her time at the company, claimed this pattern helped the social media giant sell more of the digital adverts that generate most of its revenue.

Campaigners placed cut-out figures of the Facebook CEO on the lawn outside the US Capitol. Credit: Kevin Wolf/AP/AVAAZ

Bosses are not tackling online hate and are "intentionally hiding" information

She claimed Facebook’s own research shows that it amplifies hate, misinformation and political unrest - but the company "intentionally hides vital information from the public, from the US government and from governments around the world".

Ms Haugen alleged the company cannot properly tackle misinformation because it relies on cheaper artificial intelligence to moderate its platforms, identifying only 10% to 20% of posts spreading misinformation.

She also claimed Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump in the presidential elections last year.

Ms Haugen alleged that contributed to the deadly January 6 assault on the US Capitol.

After the November election, Facebook dissolved the civic integrity unit where Ms Haugen had been working.

That, she says, was the moment she realised “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

Ms Haugen told Congress the government needs to step in to oversee the company. Credit: AP

Facebook prioritises profit over safety

Ms Haugen claimed in her written testimony that bosses "know" exactly how to make Facebook and Instagram safer for users but won’t make the necessary changes because they put their "astronomical profits before people".

"I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolves these conflicts in favour of its own profits," claimed Ms Haugen.

"The result has been more division, more harm, more lies, more threats and more combat. In some cases... this dangerous online talk has led to actual violence that harms and even kills people."

She urged: “Congressional action is needed. They won’t solve this crisis without your help.”

Facebook's co-founder Mark Zuckerberg has been criticised for his app breaching user privacy and safety. Credit: PA

The "buck stops with Mark Zuckerberg"

Ms Haugen suggested there is no one else who can hold Facebook's CEO and co-founder Mark Zuckerberg accountable, as he controls more than 50% of the company's voting shares.

She described this as a "very unique role" in the tech industry, where "no similarly powerful companies are as unilaterally controlled", and said the government needs to step in with stricter oversight of Facebook.

Ms Haugen added that she believed Mr Zuckerberg was familiar with some of the internal research raising concerns about the potential negative impacts of Instagram on children.

"Facebook wants to trick you into thinking that privacy protections or changes to Section 230 alone will be sufficient," she said.

"While important, these will not get to the core of the issue, which is that no one truly understands the destructive choices made by Facebook, except Facebook."

The data expert added that she does not think Facebook set out to build a destructive platform - but "in the end, the buck stops with Mark".


What has Facebook said?

In response to Ms Haugen's claim of research showing the potential harm of Instagram to teens, Facebook said this was "inaccurate" and that internal and external studies "found teens report having both positive and negative experiences with social media".

"The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced," said a spokesperson.

Facebook added that it has "invested heavily in people and technology to keep our platform safe" and that fighting misinformation is a "priority".

“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago," added the spokesperson.

"We have a strong track record of using our research — as well as external research and close collaboration with experts and organizations — to inform changes to our apps.”

Facebook said protecting the community is "more important than maximizing profits" and that to "suggest we encourage bad content and do nothing is just not true".

"To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016," added a spokesperson.