
Ex-Facebook manager criticizes company, urges more oversight

WASHINGTON – While accusing the giant social network of pursuing profits over safety, a former Facebook data scientist told Congress Tuesday she believes stricter government oversight could alleviate the dangers the company poses, from harming children to inciting political violence to fueling misinformation.

Frances Haugen, testifying to the Senate Commerce Subcommittee on Consumer Protection, presented a wide-ranging condemnation of Facebook. She accused the company of failing to make changes to Instagram after internal research showed apparent harm to some teens, and of being dishonest in its public fight against hate and misinformation. Haugen’s accusations were buttressed by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.

But she also offered thoughtful ideas about how Facebook’s social media platforms could be made safer. Haugen laid responsibility for the company’s profits-over-safety strategy right at the top, with CEO Mark Zuckerberg, but she also expressed empathy for Facebook’s dilemma.

Haugen, who says she joined the company in 2019 because “Facebook has the potential to bring out the best in us,” said she didn’t leak internal documents to a newspaper and then come before Congress in order to destroy the company or push for its breakup, a step many consumer advocates and lawmakers of both parties have demanded.

Haugen is a 37-year-old data expert from Iowa with a degree in computer engineering and a master’s degree in business from Harvard. Prior to being recruited by Facebook, she worked for 15 years at tech companies including Google, Pinterest and Yelp.

“Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.”

“Congressional action is needed,” she said. “They won’t solve this crisis without your help.”

In a note to Facebook employees Tuesday, Zuckerberg disputed Haugen’s portrayal of the company as one that puts profit over the well-being of its users, or that pushes divisive content.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” Zuckerberg wrote.

He did, however, appear to agree with Haugen on the need for updated internet regulations, saying that would relieve private companies from having to make decisions on social issues on their own.

“We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress,” Zuckerberg wrote.

Democrats and Republicans have shown a rare unity around the revelations of Facebook’s handling of potential risks to teens from Instagram, and bipartisan bills have proliferated to address social media and data-privacy problems. But getting legislation through Congress is a heavy slog. The Federal Trade Commission has taken a stricter stance toward Facebook and other tech giants in recent years.

“Whenever you have Republicans and Democrats on the same page, you’re probably more likely to see something,” said Gautam Hans, a technology law and free speech expert at Vanderbilt University.

Haugen suggested, for example, that the minimum age for Facebook’s popular Instagram photo-sharing platform could be increased from the current 13 to 16 or 18.

She also acknowledged the limitations of possible remedies. Facebook, like other social media companies, uses algorithms to rank and recommend content to users’ news feeds. When the ranking is based on engagement — likes, shares and comments — as it is now with Facebook, users can be vulnerable to manipulation and misinformation. Haugen would prefer the ranking to be chronological. But, she testified, “People will choose the more addictive option even if it is leading their daughters to eating disorders.”

Haugen said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.

Despite the enmity that the new algorithms were feeding, she said Facebook found that they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate the vast majority of its revenue.

Haugen said she believed Facebook didn’t set out to build a destructive platform. “I have a huge amount of empathy for Facebook,” she said. “These are really hard questions, and I think they feel a little trapped and isolated.”

But “in the end, the buck stops with Mark,” Haugen said, referring to Zuckerberg, who controls more than 50% of Facebook’s voting shares. “There is no one currently holding Mark accountable but himself.”

Haugen said she believed that Zuckerberg was familiar with some of the internal research raising concerns about Instagram’s potential negative impacts.

The subcommittee is examining Facebook’s use of information its own researchers compiled about Instagram. Those findings could indicate potential harm for some of its young users, especially girls, although Facebook publicly downplayed possible negative impacts. For some of the teens devoted to Facebook’s popular photo-sharing platform, the peer pressure generated by the visually focused Instagram led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed.

One internal study found that 13.5% of teen girls said Instagram makes thoughts of suicide worse and 17% said it makes eating disorders worse.

Haugen also has filed complaints with federal authorities alleging that Facebook’s own research shows that it amplifies hate, misinformation and political unrest, but that the company hides what it knows.

After recent reports in The Wall Street Journal based on documents she leaked to the newspaper raised a public outcry, Haugen revealed her identity in a CBS “60 Minutes” interview aired Sunday night.

As the public relations debacle over the Instagram research grew last week, Facebook put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12.

Haugen said that Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump in last year's presidential election, alleging that doing so contributed to the deadly Jan. 6 assault on the U.S. Capitol.

After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. That was the moment, she said, when she realized that “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

Haugen says she told Facebook executives when they recruited her that she wanted to work in an area of the company that fights misinformation, because she had lost a friend to online conspiracy theories.

Facebook maintains that Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarization.

“Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with (top) executives – and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about," the company said in a statement.

___

Associated Press writers Matt O’Brien in Providence, Rhode Island, and Amanda Seitz in Columbus, Ohio, contributed to this report.

___

Follow Marcy Gordon at https://twitter.com/mgordonap.

