QAnon Research: May 8, 2023
Fox News loses its lead conspiracy theorist, and the QAnon Newsletter makes its proud return. Give us a primetime slot.
Welcome back to the QAnon Newsletter! The past few newsletters were skipped due in part to the QAnonference 2023, where we were thrilled to meet with the incredible researchers continuing to dive deep into the world of conspiracy theories. The rest of the delay is blamed on the busy travel schedules of the editors.
This month’s newsletter catches us up with the latest work of members of the QAnon Research Network, highlighted below. We include some additional noteworthy publications focusing on the German far-right, some new conspiracy theory datasets, and a few upcoming (and archived) events—you can meet us at ICA, if you’ll be in attendance! Next month’s newsletter is already in the works, with many excellent publications that wouldn’t be appropriate to squeeze in today.
During the hiatus, co-editor Peter visited NYC where he encountered QAnon’s infamous JFK Jr. impersonator among the conspiratorial crowd at Donald Trump’s arraignment. QAnon has stayed relevant in other ways in the past two months:
Capitol rioter and “QAnon Shaman” Jacob Chansley was released early from prison,
Arizona sheriff and US Senate candidate Mark Lamb made appearances on QAnon-hosted web broadcasts,
Current Truth Social CEO Devin Nunes appeared on a QAnon-hosted web broadcast (yeah, same one),
Fellow Arizonan Liz Harris was expelled from the Arizona House of Representatives after inviting a conspiracy theorist to a hearing,
Will Sommer discussed the mainstreaming of QAnon with Al Jazeera,
A town in the Netherlands provided an example of how QAnon’s lies might play out on the local level,
Kiera Butler explored the success of the QAnon-linked #DiedSuddenly documentary,
And a K-pop star donned a QAnon t-shirt.
Much of the information posted here would be difficult to find without the contributions of our submitters, so please submit to our Google form to keep everyone in the loop. We also encourage people to comment on posts on the website and to always feel free to reach out to us. To grow the community, please invite anyone who might be interested by sending out a link to this newsletter:
We look forward to your submissions. Take care!
“A God-Tier LARP? QAnon as Conspiracy Fictioning” in Social Media + Society [Link]
Daniël de Zeeuw and Alex Gekker
Abstract: The QAnon movement, which gained a lot of traction in recent years, defies categorization: is it a conspiracy theory, a new mythology, a social movement, a religious cult, or an alternate reality game? How did the posts of a (supposedly) anonymous government insider named Q on an obscure online imageboard in October 2017 instigate a serious conspiracy movement taking part in the storming of the US Capitol in early 2021? Returning to the origins of QAnon on 4chan’s Politically Incorrect board and its initial reception as a potential LARP, we analyze it as an instance of participatory online play that fosters deep engagement above all. Drawing on concepts from play and performance studies, we theorize the dynamics by which QAnon developed into an influential conspiracy narrative as instances of “conspiracy fictioning.” In particular, we revive the notion of hyperstition to make sense of how such conspiracy fictionings work to recursively “bootstrap” their own alternate realities into existence. By thus exploring the participatory and playful engagement mechanisms that drive today’s conspiracy movements, we aim to elucidate the epistemological and socio-political dynamics that mark the growing entanglement of play and politics, fact and fiction in society.
“‘No Reason[.] [I]t /Should/ Happen here’: Analyzing Flynn’s Retroactive Doublespeak During a QAnon Event” in Political Communication [Link]
Josephine Lukito, Jacob Gursky, Jordan Foley, Yunkang Yang, Katie Joseff, and Porsmita Borah
Abstract: As digitally organized, conspiratorial extremist groups gain more attention in the United States, researchers face increasing calls to better understand their in-group and out-group communication strategies. Using the QAnon conspiracy community as a case study, we use data from news coverage, social media, and ethnographic field work surrounding a prominent QAnon conference to analyze the uptake and aftermath of a controversial comment made by a public figure at the event. Our mixed methods analysis finds that QAnon’s efforts to use retroactive doublespeak produced mixed results, persuading some members to re-interpret the comment; however, there was a limit to its effectiveness. Our findings contribute to the literature on political extremism and digital media by elucidating how anti-publics within the QAnon movement reconstruct events and thread the rhetorical needle to reconcile contradictory messages. In particular, we highlight the factors that precede anti-publics’ use of retroactive doublespeak and discuss its use to negotiate the tension between in-group and out-group interpretations of events.
“‘We the People, Not the Sheeple’: QAnon and the Transnational Mobilisation of Millennialist Far-right Conspiracy Theories” in First Monday [Link]
Abstract: QAnon is a U.S.-based conspiracy theory that has been branded as far-right, yet it remains unclear which tenets of transnational far-right ideology are present within QAnon discourse and how adherents actively participate in the movement. To address this problem, a multi-phase content analysis of 1,000 tweets and 8kun posts explores the presence of transnational far-right tenets and millennialist themes within QAnon discourse. A posting frequency analysis of 37,782 tweets and 9,023 posts determines how QAnon adherents participate in precipitating a millennialist apocalypse, and how they can be disrupted. The results suggest that Australian QAnon communities integrate national themes and narratives to ground discourse in the Australian context, and communicate far-right tenets to identify a complex, interconnected left-wing deep state that must be combatted through a coming apocalypse in which QAnon adherents will participate. This research develops new understandings of how far-right ideology can mould to fit different national contexts and covertly manifest in discussion topics that are not explicitly far-right, and shows that millennialist movements can be both accelerated and disrupted using social media.
“The Efficacy of Interventions in Reducing Belief in Conspiracy Theories: A Systematic Review” in PLoS ONE [Link]
Cian O’Mahony, Maryanne Brassil, Gillian Murphy, and Conor Linehan
Abstract: Conspiracy beliefs have become a topic of increasing interest among behavioural researchers. While holding conspiracy beliefs has been associated with several detrimental social, personal, and health consequences, little research has been dedicated to systematically reviewing the methods that could reduce conspiracy beliefs. We conducted a systematic review to identify and assess interventions that have sought to counter conspiracy beliefs. Out of 25 studies (total N = 7179), we found that while the majority of interventions were ineffective in terms of changing conspiracy beliefs, several interventions were particularly effective. Interventions that fostered an analytical mindset or taught critical thinking skills were found to be the most effective in terms of changing conspiracy beliefs. Our findings are important as we develop future research to combat conspiracy beliefs.
“Exposure to Untrustworthy Websites in the 2020 US Election” in Nature Human Behaviour [Link]
Ryan C. Moore, Ross Dahlke, and Jeffrey T. Hancock
Abstract: Research using large-scale data on individuals’ internet use has provided vital information about the scope and nature of exposure to misinformation online. However, most prior work relies on data collected during the 2016 US election. Here we examine exposure to untrustworthy websites during the 2020 US election, using over 7.5 million website visits from 1,151 American adults. We find that 26.2% (95% confidence interval 22.5% to 29.8%) of Americans were exposed to untrustworthy websites in 2020, down from 44.3% (95% confidence interval 40.8% to 47.7%) in 2016. Older adults and conservatives continued to be the most exposed in 2020 as in 2016, albeit at lower rates. The role of online platforms in exposing people to untrustworthy websites changed, with Facebook playing a smaller role in 2020 than in 2016. Our findings do not minimize misinformation as a key social problem, but instead highlight important changes in its consumption, suggesting directions for future research and practice.
The Revelations of Q. Dissemination and Resonance of the QAnon Conspiracy Theory Among US Evangelical Christians and the Role of the COVID-19 Crisis [Link]
Commemorative Populism in the COVID-19 Pandemic: The Strategic (Ab)use of Memory in Anti-Corona Protest Communication on Telegram [Link]
German Corona Protest Mobilizers on Telegram and Their Relations to the Far Right: A Network and Topic Analysis [Link]
Special Issue: Conspiracy Theory Theory in Social Epistemology [Link]
Book: Misinformation in the Digital Age: An American Infodemic [Link]
[Russian, untranslated] Conspiracy as ARG: Media and Game Essence of QAnon [Link]
If You’re Attending: International Communication Association 2023 Conference
We’ll be in attendance at ICA this year. As far as we’re aware, there are two QAnon-specific presentations, but there are many conspiracy theory and disinformation panels (as if anyone who is attending needs to be told that). The QAnon presentations are:
Friday, May 26th: “Japanese QAnons and Disinformation” by Kayo Mimizuka
Saturday, May 27th: “A Mixed-Methods Analysis of Americans’ QAnon Website Consumption” by Ross Dahlke, Ryan C. Moore, Peter Forberg, and Jeff Hancock
Register: The Third Annual GNET Conference [Link]
May 24th and 25th, 2023
“The Global Network on Extremism and Technology (GNET), a project of the International Centre for the Study of Radicalisation (ICSR), is delighted to announce that its third annual conference will take place 24-25 May, 2023 at King’s College London. Participants will have the option to join virtually or in-person.
“The GNET Conference aims to explore the complex and evolving relationship between terrorism, violent extremism and technology. Over one and a half days, we will host six panels featuring international experts from a range of industries and disciplines.”
Transcript: The American Roots of QAnon [Part 1, Part 2]
From the author: “The following is…the speech I gave at Purdue University in early April on the uniquely American properties of the QAnon conspiracy theory. Because it wasn’t recorded, I decided to post it online, broken up into two parts because it’s really long.”
Watch: Optimizing for What? Algorithmic Amplification and Society [Link]
A two-day symposium exploring algorithmic amplification and distortion as well as potential interventions, hosted by the Knight Institute.
YouNICon: YouTube’s CommuNIty of Conspiracy Videos [Link]
Liaw Shao Yi, Fan Huang, Fabricio Benevenuto, Haewoon Kwak, and Jisun An
Abstract: …This paper seeks to develop a dataset, YOUNICON, to enable researchers to perform conspiracy theory detection as well as classification of videos with conspiracy theories into different topics. YOUNICON is a dataset with a large collection of videos from suspicious channels that were identified to contain conspiracy theories in a previous study (Ledwich and Zaitsev 2020).
IRMA: the 335-million-word Italian corpus for studying Misinformation [Link]
Fabio Carrella, Alessandro Miani, and Stephan Lewandowsky
Abstract: …we present IRMA, a corpus containing over 600,000 Italian news articles (335+ million tokens) collected from 56 websites classified as ‘untrustworthy’ by professional fact-checkers. The corpus is freely available and comprises a rich set of text- and website-level data, representing a turnkey resource to test hypotheses and develop automatic detection algorithms. It contains texts, titles, and dates (from 2004 to 2022), along with three types of semantic measures (i.e., keywords, topics at three different resolutions, and LIWC lexical features). IRMA also includes domain-specific information such as source type (e.g., political, health, conspiracy, etc.), credibility, and higher-level metadata, including several metrics of website incoming traffic that allow researchers to investigate user online behavior.
COCO: an annotated Twitter dataset of COVID-19 conspiracy theories [Link]
Johannes Langguth, Daniel Thilo Schroeder, Petra Filkuková, Stefan Brenner, Jesper Phillips, and Konstantin Pogorelov
Abstract: …we created a dataset of 3495 tweets with manual labeling of the stance of each tweet w.r.t. 12 different conspiracy topics. The dataset thus contains almost 42,000 labels, each of which was determined by majority vote among three expert annotators. The dataset was selected from COVID-19 related Twitter data spanning from January 2020 to June 2021 using a list of 54 keywords. The dataset can be used to train machine learning based classifiers for both stance and topic detection, either individually or simultaneously.