Facebook Whistleblower Exposes Need For Change Within Company


Sydney Martin, Staff Writer

The Facebook log-in screen. (Sydney Martin)

It’s no secret that social media has had a major impact on the way we live our lives. Facebook, now rebranded as Meta, is responsible for some of the highest-traffic sites on the internet: Instagram and Facebook. However, the company has drawn its fair share of criticism. With the traffic these sites experience, the company now faces issues such as hate speech, which has negative effects on young people. Scrutiny of these problems only intensified when an anonymous employee filed complaints with federal law enforcement in September.

Her identity would later be revealed in a 60 Minutes interview as Frances Haugen. A former data scientist, Haugen disclosed thousands of pages of internal documents to the U.S. Securities and Exchange Commission, documents that would later become the basis of The Wall Street Journal’s Facebook Files.

The issues exposed ranged from rampant hate speech on the platform to the full extent of Instagram’s effects on teenagers’ mental health. In the 60 Minutes interview, Haugen shared that “Facebook’s own research says it is not just [that] Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”

According to internal documents made public in October, nearly 66% of teenage girls on Instagram said they experienced “negative social comparison,” and 32% of girls said that when they felt bad about their bodies, Instagram made the problem worse.

With the documents and the data in them made public, it became clear that Facebook knew more about the issues on its platform than it let on. This led many to argue that the company had repeatedly chosen user engagement and traffic over safety.

In addition to the platforms’ negative mental health impact, misinformation and hate speech were rampant among users. According to Haugen, while measures were taken to reduce misinformation during the 2020 election, those safeguards were removed from the platform after the race had ended.

Angry posts often incited strong feelings in users, driving an increase in interactions and, thus, more time spent on the platform — exactly the behavior the algorithm is designed to push.

All of this has led Facebook to reexamine its internal policies and algorithms. Having already postponed plans to create an Instagram app geared toward children under 13, the company now faces legal action from Haugen and her lawyers.

Despite this, Haugen has said in multiple interviews that she doesn’t want to harm the company. In an interview with Time Magazine, she said that she often saw employee teams trying to tackle issues far too large for their size.

Not only were these teams responsible for dealing with misinformation on Facebook, but they were also often responsible for supporting international users, some of whom were gaining access to social media for the first time.

In the same interview, Haugen said that she was now turning to what comes next. Given the harm social media companies cause, she wants to start a grassroots movement to help young people combat them. She believes that young people leading other young people will have a much larger impact than adults trying to do the same.

It’s her hope that the actions she’s taken will put more of a spotlight on the need for transparency within social media platforms, as well as on the need to stop prioritizing user engagement over user safety and honest facts.