Facebook whistleblower revealed on ‘60 Minutes,’ says the company prioritized profit over public good

The identity of the Facebook whistleblower who released tens of thousands of pages of internal research and documents — leading to a firestorm for the social media company in recent weeks — was revealed on “60 Minutes” Sunday night as Frances Haugen.

The 37-year-old former Facebook product manager who worked on civic integrity issues at the company says the documents show that Facebook knows its platforms are used to spread hate, violence and misinformation, and that the company has tried to hide that evidence.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money,” Haugen told “60 Minutes.”
“60 Minutes” correspondent Scott Pelley quoted one internal Facebook (FB) document as saying: “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.”

About a month ago, Haugen filed at least eight complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public. She also shared the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps, including the negative effects of misinformation and the harm caused, especially to young girls, by Instagram.
Haugen, who started at Facebook in 2019 after previously working for other tech giants like Google (GOOGL) and Pinterest (PINS), is set to testify on Tuesday before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.

“I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I’ve seen before,” Haugen said. “At some point in 2021, I realized I’m going to have to do this in a systemic way, that I’m going to have to get out enough [documents] that no one can question that this is real.”
Facebook has aggressively pushed back against the reports, calling many of the claims “misleading” and arguing that its apps do more good than harm.
“Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” Facebook spokesperson Lena Pietsch said in a statement to CNN Business immediately following the “60 Minutes” interview. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
Several hours after the interview aired, Pietsch released a statement of more than 700 words laying out what the company called “missing facts” from the segment and saying the interview “used select company materials to tell a misleading story about the research we do to improve our products.”
A spokesperson for “60 Minutes” did not immediately respond to a request for comment from CNN Business on Facebook’s claims.
On Sunday morning ahead of the “60 Minutes” interview, Facebook Vice President of Global Affairs Nick Clegg told CNN’s Brian Stelter that “there is no perfection on social media as much as in any other walk of life.”
“We do a huge amount of research, we share it with external researchers as much as we can, but do remember there is … a world of difference between doing a peer-reviewed exercise in cooperation with other academics and preparing papers internally to provoke and inform internal discussion,” Clegg said.
Haugen said she believes Facebook Founder and CEO Mark Zuckerberg “never set out to make a hateful platform, but he has allowed choices to be made where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach.”


Whistleblower revealed

Haugen said she was recruited by Facebook in 2019 and took the job to work on addressing misinformation. But after Facebook decided to dissolve its civic integrity team shortly after the 2020 presidential election, her feelings about the company started to change.
She suggested that this decision — and moves by the company to turn off other election protection measures such as misinformation prevention tools — allowed the platform to be used to help organize the January 6 riot on Capitol Hill.
“They basically said, ‘Oh good, we made it through the election, there weren’t riots, we can get rid of civic integrity now,'” she said. “Fast forward a couple of months, and we had the Insurrection. When they got rid of civic integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.'”
Facebook says the civic integrity team’s work was distributed to other units when it was dissolved. Facebook Vice President of Integrity Guy Rosen said on Twitter Sunday night that the group was integrated into other teams so the “work pioneered for elections could be applied even further.”
The social media company’s algorithm that’s designed to show users content that they’re most likely to engage with is responsible for many of its problems, Haugen said.
“One of the consequences of how Facebook is picking out that content today is that it is optimizing for content that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions,” she said. She added that the company recognizes that “if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”
Facebook’s Pietsch said in her Sunday night statement that the platform depends on “being used in ways that bring people closer together” to attract advertisers, adding, “protecting our community is more important than maximizing our profits.”
In an internal memo obtained by the New York Times earlier Sunday, Clegg disputed claims that Facebook contributed to the January 6 riot.
“Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out,” Clegg said in the memo. “So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts.”
Haugen said that while “no one at Facebook is malevolent … the incentives are misaligned.”
“Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction,” she said. “And the more anger that they get exposed to, the more they interact and the more they consume.”
