
Big Tech is Threatening our Democratic Systems 

Sachi Carlyn Lozano 


 I still remember the sinking feeling in my stomach, the genuine fear, and the hurt I felt as I anxiously refreshed the live vote count. With nearly half the votes in, it was clear: Ferdinand “Bongbong” Marcos Jr., son of the former dictator, was going to be the 17th president of the Philippines. I still remember being at my voting site, drenched in sweat, my pink tank top now slightly red. Just two hours after the polls had closed, my family and I watched every vote come in, cementing our new reality.

We had been warned about it. We had heard of different groups trying to spread misinformation, but we were convinced that Maria Leonor “Leni” Robredo, then the vice president, was going to win. We saw how the sea of pink—Leni’s campaign color—swept the nation. How a leader who was compassionate, transparent, and selfless brought the country together and reignited the love of so many—myself included—for our nation. We underestimated the warnings. Turns out, they were more serious than we realized.

With roughly 91 million Filipinos, or 83% of the population, actively on Facebook, the platform has become the largest channel for disseminating information in the nation (World Population Review). Because its algorithms are not held to the journalistic standards essential for distributing news, the Marcos camp was able to exploit the absence of such safeguards to fuel its propaganda and historical revisionism. Reporters from the New York Times describe how the Marcoses sought to target young voters with no memory of the past. At a campaign rally, they note, a 15-year-old girl said “that she got most of her news about Mr. Marcos from TikTok and Facebook, and that she did not believe much in books” (Wee and Elemia).

These powerful accounts explain how the dictator’s family found themselves back in power. The battle we, the opposition, had lost was much larger than we perceived it to be. It was unlike traditional electoral fraud; it was a complex and calculated plan to capitalize on mass media and propagate a campaign built on lies.

Journalist Rebecca MacKinnon once questioned how the sovereigns of cyberspace would wield their power: “Would they censor and restrict freedoms to serve advertisers or governments with whom they were trying to curry favor?” (Tufekci 66). Well, in the Philippines, an electorate misinformed by proliferating false narratives is greatly responsible for the return of a dynasty whose former dictatorship was characterized by plunder and extrajudicial killings.

While I personally have made peace with the elaborate manipulation of the other side, my growing fear of the power of big tech remains. What do cases like the Philippine elections tell us about the impact that technological developments have on political discourse? What does it mean for democracy if algorithmically curated media is what shapes our understanding?  

These curiosities and fears are reminiscent of the ones Zeynep Tufekci raised in 2016. In fact, the 2022 Philippine elections bear out the very issues she foresaw as concerning. In “As the Pirates Become CEOs: The Closing of the Open Internet,” Tufekci warns against surveillance, censorship, and media monopolization. These recurring themes underscore the threat that technological advancement poses not only to political spaces, but also to issues such as personal body image and data privacy. These dangers relate to one another because they are all fueled by the same companies that monopolize the space with their profit-driven algorithms.

The central issue is best articulated by Tufekci: “There are billions of people on the Internet, but a few services capture or shape most of their activities” (67). Our cyberspaces are dominated by platforms owned by the same handful of companies, and those companies design algorithms that encourage users to linger on their sites. As Hossein Derakhshan, the grandfather of the Iranian blogosphere, finds, these “algorithmic walled gardens” strangle access to hyperlinks, effectively “directing users to content within their walls and regulating access to the outside Web in very specific ways” (Tufekci 67). Even when users do shift to other platforms, the platform they move to is likely owned by the same group, as companies continually acquire their growing competitors. Facebook, for instance, currently owns both Instagram and WhatsApp, among other acquisitions. As such, users’ perception of the world is shaped almost entirely by the single source from which they receive their information. This is only the beginning of the many alarming consequences that come with the monopolization of online media.

In developing countries, the severity of this concern is particularly prominent. As reported by David Clark, Facebook has established agreements with mobile service providers in developing countries that allow users to access Facebook free of data charges. In poorer areas like the Philippines, this can make Facebook the only accessible source of information. According to Clark, “many people are not even aware that there is an Internet outside of Facebook” (Tufekci 70). Only 23.4% of households in the Philippines have access to laptops (David 4), but “the sharp decline in costs of smartphone units that are capable of Internet connection led to an expansion of internet access in the Philippines” (David 4). The reality that a substantial number of people draw conclusions from information sourced from one curated ecosystem is concerning because the algorithms that facilitate this flow of information are commercially and politically biased, with no consideration for truth.

Commercial and political bias of this kind stems from these platforms’ need to secure their funding. Given that an engaged audience attracts advertisers—their main source of profit—“likes are the main currency in Facebook’s all-important algorithm” (Tufekci 67). Consequently, what one is shown online is determined by the content’s popularity. In these online realms, facts and fallacies are regarded as equal; how information is distributed is subject not to its validity but to the level of engagement it receives. The success of “troll farms” in the Philippine elections highlights how susceptible this system is to abuse.

Troll farms are cyber-armies hired to “[churn] out fake content, false narratives and anything else [a] client wants” (Mahtani and Cabato). An investigation by The Washington Post unveils the intricacies of this manipulation machinery. Workers are not only tasked with producing content about their clients’ preferred candidates; they also deliberately act like genuine users. As the report notes, “one female worker [pointed to] an open Twitter page showing the fake profile of a young, pink-cheeked woman” (Mahtani and Cabato). Because the algorithm is like-based rather than fact-based, these troll farms were able to control what dominated the media landscape. By engaging with the content they wanted to promote, factual or not, they effectively fed audiences the narratives they were hired to propagate.

Unsurprisingly, these profit-driven forces allowed for the rampant spread of lies. For instance, while I grew up hearing my grandma’s accounts of how former first lady Imelda Marcos lived lavishly off the people’s money, mass audiences were convinced that the unexplained wealth of the Marcoses came from the Tagean Tallano family, who, according to false sources, ruled the Philippines prior to the Spanish colonial era. Such stories blatantly contradict our history, and the IBON Foundation has estimated the ill-gotten wealth of the Marcoses at P1.87 trillion ($36.4 billion) (Basul). This ‘Tallano Gold’ myth is only one of many falsified narratives sold to unknowing audiences, and its rampant spread underscores the real threat Facebook and other platforms pose to facts.

Nevertheless, it cannot be ignored that the facilitation of conversation through these algorithmic walled gardens has made widespread communication far more convenient, even in political contexts. For example, I noticed that the more I searched for my presidential candidate, Leni, the more frequently information and ‘news’ about her came up on my feed, even when I was not actively looking for it. This kept me up to date on her campaign and her work as vice president. Such access was beneficial because the elections took place while COVID-19 was still rampant in the Philippines. Additionally, as many groups and companies have come to understand the considerable influence Facebook has on ongoing conversations, nearly all news sites and civic groups have made their information accessible via Facebook. Thus, I and others can arguably access all the information we need through this one app.

Despite such benefits, the accessibility and convenience of Facebook still highlight the underlying dangers of this sophisticated platform. That I was seeing content related to the candidate I had been searching for is convenient, but also alarming. Clearly, my previous searches were being collected and stored by these systems to determine what I should and should not be shown—again, to keep me on the platform longer and thus generate more revenue. Further, algorithms do not account for disparities in the media literacy of users, which explains why I would see both factual information and false claims about Leni on my feed. Fortunately, I was able to recognize when content seemed falsified and cross-check it on other platforms. Unfortunately, not everyone develops high media literacy or has access to resources outside of Facebook.

Such realities demonstrate how the issue of misinformation stems both from the irresponsibility of big tech and from large educational inequalities among audiences. Although companies such as Facebook did not intend for their platforms to become news sources, the reality is that they have. Thus, they can no longer operate as regular social networking platforms. Seeking funding outside of ads would be a step toward practicing more objectivity. In addition, Facebook reported $117.9 billion in revenue in 2021. If the company wanted to address the issues emerging from its platform, I believe an effective way to do so would be to invest some of that revenue in educating users. It could donate to organizations working to promote media literacy, or perhaps hold its own workshops for lower-income audiences.

Nevertheless, while I fully believe big tech must assume great responsibility in addressing this issue, it is imperative that governments and families also play an active role in regulating media consumption. Authorities should consider dedicating a national agency to overseeing social media platforms, and media literacy should be integrated into school curricula. Finally, I believe critical thinking begins at home. In their scheme, the Marcoses targeted first-time voters precisely because of their avid use of social media. Thus, parents should work alongside the government to help their children process the media they consume.

For those, like myself, who belong to more privileged sectors, it is convenient to contend that big tech is too large and influential to reel in. Granted, we have other options. However, the Philippine elections demonstrate that the consequences of widespread misinformation affect everyone, regardless of whether one was personally fooled by false narratives. On May 9, 2022, many Filipinos cast their votes based on lies and misconceptions. Our democracies rely on facts to inform the decisions voters make. If we can no longer do so, free and fair elections—a central feature of democracy—cannot occur.

Facebook’s refusal to better regulate the content it feeds its users has resulted in the widespread dissemination of false and misleading narratives that effectively rewrote Philippine history and, in doing so, reinstated a dictatorial dynasty. This case magnifies the need for Facebook to account for its role in the proliferation of both misinformation and disinformation. Likewise, it underscores the importance of governments and families in ensuring smart and safe media consumption. Therefore, while there are clear advantages to the use of Facebook as a facilitator of discourse, authorities and audiences alike need to be aware of the risks and work harder to condemn platforms that undermine the integrity of our democratic processes.


Works Cited  

Basul, Terence. “The Lies about the ‘Tallano Gold’ and the Truth about Ill-Gotten Wealth.” Bulatlat, 19 Feb. 2022, https://www.bulatlat.com/2022/02/19/the-lies-about-the-tallano-gold-and-the-truth-about-ill-gotten-wealth/.

David, C. C., M. R. S. San Pascual, and M. E. S. Torres. “Reliance on Facebook for News and Its Influence on Political Engagement.” PLOS ONE, vol. 14, no. 3, 19 Mar. 2019, https://doi.org/10.1371/journal.pone.0212263.

“Facebook Users by Country.” World Population Review, https://worldpopulationreview.com/country-rankings/facebook-users-by-country. Accessed 29 Sept. 2022.

Mahtani, Shibani, and Regine Cabato. “Why Crafty Internet Trolls in the Philippines May Be Coming to a Website near You.” The Washington Post, 29 July 2019, https://www.washingtonpost.com/world/asia_pacific/why-crafty-internet-trolls-in-the-philippines-may-be-coming-to-a-website-near-you/2019/07/25/c5d42ee2-5c53-11e9-98d4-844088d135f2_story.html.

Tufekci, Zeynep. “As the Pirates Become CEOs: The Closing of the Open Internet.” Daedalus, vol. 145, no. 1, 2016, pp. 65–78, https://doi.org/10.1162/daed_a_00366.

Wee, Sui-Lee, and Camille Elemia. “The Philippines Toppled One Marcos. Now His Son May Become President.” The New York Times, 13 Apr. 2022, https://www.nytimes.com/2022/04/13/world/asia/philippines-marcos-president-election.html.