Reframing QAnon: Why Media Literacy Is the Answer to Political Misinformation, Not Censorship
Brady Tavernier
In early September, the Instagram account @soyouwanttotalkabout posted an infographic unpacking homegrown terrorism and calling for users to demand that their leaders condemn white supremacy. The infographic blames the internet for breeding radicalization, citing Twitter among the various platforms responsible for the exchange of misinformation. This emphasis on corporate responsibility reflects a current conversation about whether social media companies should censor political misinformation, especially considering the recent rise of QAnon. However, following the disinformation crisis arising from the results of the 2020 Presidential election, explicit media literacy intervention must become a focal point of the conversation surrounding QAnon and online radicalization. Although some scholars argue that censorship is necessary to protect public safety, incorporating comprehensive media literacy tools on social media platforms can target the root of the problem. By doing so, social media companies can actively promote media literacy in a twenty-first-century internet landscape that demands it. To be clear, I do not intend to offer a unilateral solution to online political misinformation and radicalization; rather, I intend to reframe the conversation on censorship to encourage productive discussion and motivate further scholarship. I will first acknowledge why some scholars believe that online censorship is necessary to promote public safety, then explain why censorship is ineffective, and finally defend media literacy intervention as the most effective strategy to meet the demands of the crisis.
First and foremost, an understanding of what QAnon represents is necessary to engage with this discourse. By defining the scope of the issue, scholars are forced to address the crisis as a unique problem with unique circumstances. Mike Wendling of BBC News reports that QAnon is an “unfounded conspiracy theory” whose adherents believe that “President Trump is waging a secret war against elite Satan-worshipping paedophiles in government, business and the media.” The endgame: “a day of reckoning where prominent people such as former presidential candidate Hillary Clinton will be arrested and executed” (Wendling). In other words, everyone is lying to us, and the only beacon of truth is a seventy-four-year-old former reality TV star.
Considering that this theory presents public safety concerns, existing scholarship and commentary have focused largely on whether social media companies should engage in censorship practices. The reasoning: dangerous speech should not be dismissed or protected on internet platforms. For example, Elise Thomas, a strategist for the Australian Strategic Policy Institute, warns that conspiracies related to Covid-19 have “propelled QAnon to new heights,” contributing to a series of “violent incidents in the real world.” Thomas cites a specific incident in April, when a QAnon believer was arrested with 18 knives in her car near the USS Intrepid. Since QAnon is also a public safety issue, as censorship advocates would underscore, deceleration strategies are necessary. Considering that Thomas views QAnon as “directly fueled by its access to mainstream audiences on Twitter and Facebook,” the logical conclusion that censorship can be that decelerator is understandable. If the rise of QAnon can be connected to its presence on mainstream platforms, as Thomas suggests, then censorship could plausibly be effective.
Further, Richard Rogers, a professor of Media Studies at the University of Amsterdam, notes that deplatforming “demonstrates a shift in what is considered acceptable on social media.” In other words, while it is arguable that deplatforming these extremist hate groups only fortifies their convictions, a clear message is still communicated: your hate and misinformation have no place in these spaces. Moreover, in a comment reported in The Washington Post, University of Washington professor Kate Starbird notes that “the harmful and misleading content will be harder to find, making it tougher to recruit new members” (Lerman and Dwoskin). This pragmatic strategy of harm reduction admits that while the free exchange of misinformation cannot be fully eradicated, censorship by mainstream platforms may prevent mainstream recruitment. Under this framework, social media companies must bear responsibility for preventing the popular acceptance of dangerous speech; for many scholars, that responsibility entails censorship.
Nevertheless, Rogers contends that media attention and recruitment increase when deplatformed spaces are forced to “migrate to alternative platforms.” Rogers’ concern is substantiated by the emergence of apps like Parler. Following the contested 2020 Presidential election, “Parler shot to the top of Apple’s App Store in downloads” after claiming to provide a space for dialogue without the risk of censorship by Big Tech (Isaac and Browning). Rogers’ concern about alternative platforms, coupled with the emergence of the Parler app, highlights a question critics of censorship ask: when you draw attention to these ideological groups, are you only emboldening their dissent from the mainstream? This argument suggests that systematically censoring something so personal – speech – may only add fuel to the fire.
Therefore, QAnon demands an approach that does not rely on censorship. The focus on whether or not to censor is inhibiting comprehensive solutions and undermining effective problem solving. The problem is not just that users are consuming misinformation; it is also that consumers are not equipped with the tools to distinguish misinformation from real news. Examining the crisis solely through the framework of censorship will not allow social media companies to attack the root issue of media illiteracy. Instead of hoping that consumers will simply figure it out and placing a band-aid over existing public safety threats, social media companies must acknowledge their responsibility to provide accessible media literacy tools on their platforms.
First, media literacy must be defined. In a RAND Corporation initiative to promote media literacy, researchers define the broad field as a collection of “specific competencies, such as the abilities to access, analyze, evaluate, and communicate media messages in a variety of forms” (Huguet et al. x). By engaging with media literacy skills, consumers can gauge the credibility of sources, place sources in the context of ongoing conversations and reporting, and come to informed conclusions.
Importantly, the pivotal role of the 2020 election in informing how we engage with media literacy cannot be overstated. Marjorie Taylor Greene, a Representative-elect from Georgia and the first open QAnon supporter elected to Congress, is on record expressing her excitement for “‘a once-in-a-lifetime opportunity to take this global cabal of Satan-worshipping pedophiles out’” (Salcedo). This excerpt from The Washington Post substantiates the concern that QAnon’s popularity is beginning to dominate the political conversation. In my view, if those with power and a platform are perpetuating misinformation, the ability of the people to responsibly consume and produce political information is compromised. A concentrated focus on media literacy feels overwhelming and complex, but by homing in on a sustainable strategy, we challenge any approach that simply hides the problem.
Fortunately, academics have provided scholarship drawing connections between a lack of media literacy and conspiracy theory endorsement. Stephanie Craft, along with two other scholars, conducted a study indicating that greater news media literacy predicts lower endorsement of conspiracy theories. According to this research, a lack of media literacy skills acts as a petri dish for conspiracy theories in online spaces. Applying these findings to corporate responsibility legitimizes an overhaul of how information on social media platforms operates. Put another way, if higher media literacy skills can reduce conspiracy theory endorsement, social media companies have a responsibility to equip their users with the tools necessary to responsibly navigate political information.
Further, Andrew M. Guess and six other researchers released a study suggesting that exposure to a media literacy intervention can improve the ability to discern between real and false news. Using data from pre-registered survey experiments on electoral politics in the United States and India, they studied “the effectiveness of an intervention modeled closely on the world’s largest media literacy campaign, which provided ‘tips’ on how to spot false news” (Guess et al.). The result: a 26.5% increase in discernment between misinformation and real news among a nationally representative sample in the United States (Guess et al.). The tips themselves are worth noting. For example, the researchers describe that “one sample tip recommends that respondents ‘be skeptical of headlines,’ warning that, ‘If shocking claims in the headline sound unbelievable, they probably are’” (Guess et al.). Considering the evidence that digital media literacy interventions work, particularly within electoral politics, I recommend that social media companies integrate similar tips within their platforms, providing users with tools to assess, evaluate, and discern online content. Regarding QAnon, such measures can help consumers meet the demands of the twenty-first-century digital landscape. If the evidence points to tools that avoid censoring the free exchange of information while also uprooting the fundamental problem of media illiteracy, social media companies should follow that evidence.
Admittedly, research by R. Kelly Garrett, a professor at Ohio State University’s School of Communication, complicates this narrative with data suggesting “that social media [has] a small but significant influence on Americans’ belief accuracy.” While the role of social media is important, Garrett’s research suggests that it is not necessarily determinative. Why, then, would it be necessary for social media companies to invest in comprehensive media literacy initiatives? Nevertheless, I maintain that social media companies can play an important role in combating misinformation. First, Garrett’s work focuses specifically on the 2012 and 2016 elections, using “three-wave panel surveys conducted with representative samples of Americans” to assess whether the “use of social media for political information promoted endorsement of falsehoods about major party candidates or important campaign issues.” In effect, the study asks: would frequent users of social media be more likely to believe, say, the false conspiracy theory that Barack Obama was not born in the United States? However, the internet landscape has changed dramatically since then, especially considering the recent development of QAnon, which did not enter political discourse until after the 2016 election (LaFrance). Moreover, as mentioned previously, I do not argue that social media companies are the sole answer to political misinformation and radicalization. Rather, I believe that they can play a pivotal role by promoting media literacy instead of relying on censorship.
Despite my focus on corporate responsibility, social media companies are not the enemy. As in an employer-employee relationship, both the company and its consumers must engage collaboratively to produce results. Stacey Goodman, a high school educator focusing on media and filmmaking, puts it perfectly: “The situation is no longer us, the passive media consumers, versus them, the corporate and government media powers.” Instead, Goodman argues, “When it comes to perpetuating harmful media messages, the enemy is often us.” This distinction is important because, as Goodman notes, this is both a consumption and a production issue. In other words, social media companies are not the ones posting far-right extremist content; therefore, their responsibility is not to unilaterally delete misinformation. Rather, I believe it is a social media company’s responsibility to help consumers, the perpetrators of misinformation and radicalization, challenge how they process and produce online content. A simple embedded prompt reminding users that ‘headlines aren’t the full story’ when they try to share an article before reading it would challenge them to question whether they have critically evaluated the source. Even providing a few accessible tips on how to gauge source credibility could go a long way without inconveniencing users or alienating them through censorship.
However, my argument does not account for an important truth that Charlie Warzel, an Opinion writer for The New York Times, articulates: “You don’t have a movement like this [QAnon] without people who’ve lost trust in expertise and authority and institutions” (Bokat-Lindell). How can intervention by social media companies be effective if those who need it most are prone to distrust that intervention? Further scholarship, particularly in political psychology, is needed to understand how to navigate mass distrust of the press. Nevertheless, I maintain that social media companies can play an important role in mitigating the consequences of poor media literacy skills without censoring political discourse. To throw in the towel would be to communicate that American democracy is expendable whenever the challenge seems overwhelming.
The 2020 election has sent the American people a clear message: if we continue to neglect media literacy skills while consuming online political content, we risk further jeopardizing the democracy we constantly struggle to realize. While more must be done to mitigate the dangerous implications of political misinformation, social media companies can play a necessary role in preserving democratic principles and public safety without compromising the free exchange of information. By reframing the conversation beyond the scope of mere censorship, social media companies and scholars can reevaluate the role of corporate responsibility in preventing the effects of online political misinformation and radicalization.
Works Cited
Bokat-Lindell, Spencer. “We Are Not Going to Fact-Check Our Way Out of QAnon.” The New York Times, 3 Sept. 2020, www.nytimes.com/2020/09/03/opinion/qanon-facebook-trump.html.
Craft, Stephanie, et al. “News Media Literacy and Conspiracy Theory Endorsement.” Communication and the Public, vol. 2, no. 4, SAGE Publications, Dec. 2017, pp. 388–401, doi:10.1177/2057047317725539.
Garrett, R. Kelly. “Social Media’s Contribution to Political Misperceptions in U.S. Presidential Elections.” PLoS ONE, vol. 14, no. 3, Public Library of Science, Mar. 2019, e0213500, doi:10.1371/journal.pone.0213500.
Goodman, Stacey. “Social Media Literacy: The 5 Key Concepts.” Edutopia, George Lucas Educational Foundation, 30 May 2014, www.edutopia.org/blog/social-media-five-key-concepts-stacey-goodman.
Guess, Andrew M., et al. “A Digital Media Literacy Intervention Increases Discernment between Mainstream and False News in the United States and India.” Proceedings of the National Academy of Sciences, vol. 117, no. 27, 2020, pp. 15536–45, doi:10.1073/pnas.1920498117.
Huguet, Alice, et al. “Media Literacy Education as a Tool for Mitigating ‘Truth Decay’.” RAND Corporation, 11 July 2019, www.rand.org/pubs/research_reports/RR3050.html.
Isaac, Mike, and Kellen Browning. “Fact-Checked on Facebook and Twitter, Conservatives Switch Their Apps.” The New York Times, 11 Nov. 2020, www.nytimes.com/2020/11/11/technology/parler-rumble-newsmax.html.
LaFrance, Adrienne. “The Prophecies of Q.” The Atlantic, Atlantic Media Company, 24 Sept. 2020, www.theatlantic.com/magazine/archive/2020/06/qanon-nothing-can-stop-what-is-coming/610567/.
Lerman, Rachel, and Elizabeth Dwoskin. “Twitter Crackdown on Conspiracy Theories Could Set Agenda for Other Social Media.” The Washington Post, WP Company, 23 Jul. 2020, www.washingtonpost.com/technology/2020/07/22/twitter-bans-qanon-accounts/.
Rogers, Richard. “Deplatforming: Following Extreme Internet Celebrities to Telegram and Alternative Social Media.” European Journal of Communication, vol. 35, no. 3, SAGE Publications, June 2020, pp. 213–29, doi:10.1177/0267323120922066.
Salcedo, Andrea. “Marjorie Taylor Greene, Who Backs QAnon and Has Made Racist Remarks, Wins Congressional Seat.” The Washington Post, WP Company, 4 Nov. 2020, www.washingtonpost.com/nation/2020/11/04/marjorie-greene-qanon-georgia-congress-election/.
@soyouwanttotalkabout. “So You Want to Talk About Homegrown Terrorism.” Instagram, 8 Sept. 2020, www.instagram.com/p/CE4BywFH3y1/.
Thomas, Elise. “Twitter Moves Against QAnon Conspiracy Theorists.” The Strategist, Australian Strategic Policy Institute, 2020. ProQuest, www.proquest.com/docview/2426044014.
Wendling, Mike. “QAnon: What Is It and Where Did It Come from?” BBC News, BBC, 20 Aug. 2020, www.bbc.com/news/53498434.