INCENDIARY CONTENT WORKS, AND FACEBOOK KNOWS IT

When Mark Zuckerberg created Facebook as a sophomore at Harvard in 2004, no one imagined that the platform would grow into an entity that could inspire communal riots, threaten democracy, and damage teenagers’ mental health. When people think of social media, they think of Facebook, which is not surprising, considering that some of the biggest platforms, like WhatsApp and Instagram, are owned by Facebook. No one expected a social media platform to have such an intense impact on society. Yet, as multiple investigations into the company show, it does have an enormous impact, and it knows it too.

In September 2021, the Wall Street Journal released the infamous Facebook Files. The Files, based on internal documents, whistleblower accounts, and research reports, painted a picture of a company that knew of its failures and did little to change them.

One of the most shocking revelations was that Facebook was aware of the harm Instagram caused teen girls and did little to address it. Following the backlash, Facebook halted the development of Instagram Kids. According to a March 2020 slide presentation posted to Facebook’s internal message board, “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” Facebook has repeatedly conducted studies into the harmful effects of its app and has found damning evidence. It found that 13% of British and 6% of American users who reported suicidal thoughts traced them back to Instagram. Another presentation, from 2019, stated, “We make body image issues worse for one in three teen girls.”

So, if social media platforms like Facebook and Instagram are so harmful, why don’t we stop children from using them? Why don’t Facebook and Instagram themselves shut these dangerous dynamics down?

As Frances Haugen, the Facebook whistleblower, puts it, whenever Facebook encountered a conflict between profit and user safety, it repeatedly chose profit. Teens and younger people have been moving away from Facebook for a long time: only 5 million teens log onto Facebook every day, compared with 22 million who log onto Instagram. Retaining these users is essential to maintaining Facebook’s 100-billion-dollar annual revenue.

Facebook has also rejected attempts to restrict the disproportionate amplification of inflammatory posts because doing so might hamper its growth. The measures Facebook rejected could have prevented its services from being used to spread religious hatred in India. In December 2019, when protests over religion were sweeping the country, inflammatory content on the platform rose by 300%. By late February 2020, these calls to violence were spreading even further on WhatsApp.

A review of Facebook’s documents shows that the company also responded weakly to its site being used by human traffickers and drug cartels, and to incitement of violence against ethnic minorities in Ethiopia. Facebook’s main strength lies in its ability to keep users coming back. Through notifications and strategic algorithms, it shows people posts designed to grab their attention, even if those posts make them angry or unhappy. This is why incendiary content does so well, and why Facebook is reluctant to change that.

Social media, in less than two decades, has grown into a force so powerful that it can harm human life. Its regulation is essential to protect the people, and especially the children, who use it.

10 Jan 2022
Prishita Das