Facebook papers reveal social media concerns

Samantha Wu

Senior Victoria Tong logs in to Facebook on her cellphone.

On Oct. 22, data scientist and former Facebook employee Frances Haugen released thousands of internal documents and investigative research from her former company to Congress. These documents included investigations into Facebook's role and complicity in the Jan. 6 insurrection, its lack of hate speech moderation in underdeveloped countries, misinformation among vulnerable demographics and mental health problems in teens.

The papers showed discrepancies between public statements from Mark Zuckerberg, the CEO and creator of Facebook, and internal data from the company's own investigations. For example, when Zuckerberg testified in front of Congress in 2019, he said that Facebook's algorithm removes 94 percent of hate speech and misinformation from the app before it is reported by a human. But Haugen's documents reveal that the algorithm removes less than five percent.

One of Haugen's main claims was that the "profit optimizing [Facebook algorithm] is generating self-harm and self-hate — especially for vulnerable groups, like teenage girls." The design of the program has been called out for harming youth body image and self-esteem. "[On Facebook or Instagram] you may see pictures of people who are like models or influencers who have a bunch of friends or are wealthy and you may like feel bad about yourself or feel left out," freshman Libby Cooke said.

Richard Montgomery students are not just worried about the mental health side of the issue. "Facebook has a bad reputation kind of from a security perspective," sophomore Ian Polansky said. "There's been a lot of things [in the news] about them stealing your information." Students are concerned about what the company is doing with the data it collects, and whether users have a choice in the matter.

When asked about Facebook's reputation, sophomore Victoria Dziasek said, "It has a reputation as being for old people, but I don't know if that's necessarily bad." Compared to what is shown in other countries, the platform shown to Americans is the most polished and protected version available. Eighty-four percent of Facebook's work against misinformation targeted posts originating in the United States. The remaining 16 percent is unequally distributed throughout the rest of the world, including countries like France and India.

A majority of RM teens use both Instagram and WhatsApp, two of Facebook's most-used platforms. "I communicate with my family via WhatsApp and I communicate with my friends via Instagram," Dziasek said. But very few use Facebook directly, for a variety of reasons. "I've never really considered using it and none of my friends are on it," sophomore Kyle Baer said.

Facebook has a set of guidelines that applies to all posts and messages made on the platform. However, there is another layer of the system, Crosscheck, which is exclusively applied to celebrities and high-profile users. Those users face minimal limitations on what they can post, essentially allowing millions of VIP users free rein on the platform, regardless of whether or not they are spreading misinformation, spreading hate speech or inciting violence. Only under special circumstances, such as the Jan. 6 insurrection, will Facebook intervene against dangerous content.

Haugen left Facebook because of what she viewed as dangerous policies and a lack of accountability within the company. The unveiling of these developments has sparked discussion and criticism both on a global scale and within the community at RM.