What Allowed Facebook To Run Rampant?
The Problems
Last October 5, Frances Haugen testified before the United States Congress, accusing Facebook of prioritising profit over safety. According to the company’s internal documents, Facebook knew that Instagram harms teenage girls’ mental health, that its algorithm change promotes divisiveness, and that its failure to adequately moderate foreign-language content contributed to atrocities, including the incitement of ethnic violence in Ethiopia.
Facebook undermining integrity and public safety is not a new issue. In 2018, Christopher Wylie testified before the Senate, revealing that Facebook had harvested the data of millions of its users without their consent, data that Cambridge Analytica then used for political advertising, undermining the integrity of the 2016 US presidential election. Last year, Sophie Zhang uncovered Facebook undermining the integrity of numerous elections outside the West by allowing “coordinated inauthentic behaviour,” such as mass likes, to prejudice voters. For instance, Zhang discovered that the president of Honduras, Juan Orlando Hernández, received 59,100 likes on his posts over a period of six weeks, more than 78% of which originated from fake accounts (Wong, 2021). This distorted the public’s perception of Hernández’s popularity as a candidate and caused the algorithm to place his content at the top of Hondurans’ news feeds. These instances reinforce the urgency of Haugen’s warning: Facebook threatens democracy.
Lack of internal controls
Facebook claims to have employed 40,000 people and spent over $13 billion to improve the safety and security of its platforms (Wagner, 2021). However, Facebook employees have shown that these efforts are not enough to protect users. Zhang notified threat intelligence, the team responsible for dismantling “coordinated inauthentic behaviour,” about the situation in Honduras, but the team responded that it lacked the resources and external pressure to handle it. Furthermore, in her testimony to the European Parliament, Haugen stated that Facebook lacked the internal controls to address its platforms’ flaws and the issues resulting from them. In connection with the leaked internal documents, Haugen also stated that Facebook lacked staff who understood the languages used in developing countries. For instance, Facebook was unable to prevent the incitement of ethnic violence in Ethiopia because its systems could not moderate content in the variety of local languages used. Hence, despite Facebook’s stated efforts, the issues outlined above and the statements of former employees demonstrate that the company must do far more to make its platforms safe and secure.
Lack of effective regulation
Considering the ubiquity of Facebook and its services, and our heavy dependence on them, the lack of proper regulation is baffling. In America, s.230 of the Communications Decency Act 1996 shields social media companies from liability for user-generated content, a protection that courts have extended to decisions about how platforms’ algorithms rank and amplify that content. Consequently, companies may alter their algorithms in ways that produce harmful results. For instance, Facebook’s algorithm change promoted divisiveness, misinformation and hate. This is demonstrated by the January 6 attack on the Capitol, which Trump supporters organised on Facebook shortly after Biden was declared president, motivated by false claims of election fraud circulating on the platform.
By: Maiya Dario