How Memes Became Tools for Online Misinformation

Written by hacktivists | Published 2025/05/06
Tech Story Tags: hacktivism | social-media-propaganda | internet-freedom | digital-activism | political-activism-online | internet-trolling | digital-democracy | information-warfare

TLDR: Hacktivists pioneered memes and trolling as tools of dissent, but these tactics were later co-opted by troll farms and political groups to spread misinformation at scale, fueling chaos during major events like Brexit, the US elections, and the COVID-19 pandemic.

Authors:

(1) Filipo Sharevski, DePaul University;

(2) Benjamin Kessell, DePaul University.

Table of Links

Abstract and Introduction

2 Internet Activism and Social Media

2.1 Hashtag Activism

2.2 Hacktivism

3 Internet Activism and Misinformation

3.1 Grassroots Misinformation Operations

3.2 Mainstream Misinformation Operations

4 Hacktivism and Misinformation

4.1 Research Questions and 4.2 Sample

4.3 Methods and Instrumentation

4.4 Hacktivists’ Profiles

5 Misinformation Conceptualization and 5.1 Antecedents to Misinformation

5.2 Mental Models of Misinformation

6 Active Countering of Misinformation and 6.1 Leaking, Doxing, and Deplatforming

6.2 Anti-Misinformation “Ops”

7 Misinformation Evolution and 7.1 Counter-Misinformation Tactics

7.2 Misinformation Literacy

7.3 Misinformation hacktivism

8 Discussion

8.1 Implications

8.2 Ethical Considerations

8.3 Limitations and 8.4 Future Work

9 Conclusion and References

3.1 Grassroots Misinformation Operations

Hacktivists, perhaps inadvertently, authored or popularized the most widely utilized primitives for creating, propagating, amplifying, and disseminating misinformation: trolling and memes. This negative externality is unfortunate, as trolling and memes were initially used by Anonymous against what they perceived as a “misinformation campaign” by the Church of Scientology [75]. The “anon” members on 4chan.org practically hijacked the term “troll” – initially meaning provoking others for mutual enjoyment – turning it into abusing others for the members’ own enjoyment by posting upsetting or shocking content (usually on the /b/ board of 4chan.org [21]), harassing users (e.g., mocking funeral websites [12]), and spreading rumors [62]. What Anonymous did for the “lulz” (a brand of enjoyment etymologically derived from laughing out loud (lol)) nonetheless showed the ease with which one could exploit Internet technologies to be impolite, aggressive, disruptive, and manipulative to users’ emotional states [21].

Trolling initially came in textual format, as comments on posts, bulletin boards, and websites that “deindividualized” people’s lived experience for the “lulz” [12]. Gradually, hacktivists popularized a multimedia format of trolling, the “meme,” where textual commentary is superimposed over well-known imagery, typically representing different forms of power such as political leaders, the police, and celebrities [76]. Memes, perhaps, were the actual rite of passage to true hacktivism – moving away from the early LOLCats – as they seek to deconstruct the power represented, contest censorship, and provide political commentary [87]. Memes were put to hacktivist use en masse in operations like “Troll ISIS Day,” where Anonymous proliferated memes with rubber-duck heads or rainbow stripes to ridicule ISIS propaganda imagery and disinformation narratives on Twitter [76]. Spread together with satirizing hashtags (e.g., #Daeshbags), the trolling memes achieved a cultural virality that brought hacktivists into mainstream discourse online [92]. What the hacktivists did with memes nonetheless showed the ease with which one could disrupt, challenge, reimagine, and appropriate new political contexts by harnessing the virality and visibility of content spread on social media [84].

3.2 Mainstream Misinformation Operations

The hacktivists’ playbook of trolling and meme dissent, though initially targeted against misinformation, was skillfully appropriated for crafting and disseminating misinformation from 2014 onward, coinciding with a period of hacktivist inactivity [11]. The playbook alone, at first, was insufficient for the objective of widespread political disruption, as it necessitated a support network of individuals and/or accounts on social media for any alternative narratives to gain traction. But the “appropriators” – privy to prior campaigns of disinformation and backed by nation-state governments [113] – did not need to look far, as “sock puppet” accounts were already being utilized for spreading political falsehoods (e.g., the Martha Coakley “Twitter bomb” disinformation campaign [85]). Having all the ingredients for exploiting the virality of social media and users’ familiarity with emotionally charged discourse, the “appropriators” established troll farms in the wake of the UK’s Brexit campaign and the 2016 US elections [73,135].

The “army” behind the troll farms was particularly clever in augmenting its social bots with “sock puppet” accounts that imitate ordinary users to systematically micro-target different audiences, foster antagonism, and undermine trust in information intermediaries [6]. Playing both sides of the emotionally charged discourse already unravelling on social media, the troll farms posed as authentic, culturally competent personas (e.g., the so-called “Jenna Abrams” account [130]) as well as vocal supporters of hashtag activism (counter)movements (e.g., BlackToLive in #BlackLivesMatter and SouthLoneStar in #BlueLivesMatter [119]). They also appropriated hashtag hijacking (e.g., tagging #elections2016 and #ImVotingBecause with quotes for Donald Trump and against Hillary Clinton [3]), hashtag co-opting (e.g., #BlackGunsMatter and #syrianlivesmatter [29]), and counter-hashtagging (e.g., #NoDAPL against the Dakota Access Pipeline [45]). The troll farms even had the audacity to impersonate Anonymous itself (e.g., the @_anonymous_news impersonation of the “Your Anonymous News” Twitter account [20]).

The “meme game” of the troll farms was equally sophisticated and compounded the initial success of their operations [82]. Testing the waters with war-related memes regarding opposition to or support of the conflict in Syria [29], the troll farms capitalized on both meme trolling and Internet activism by spreading political memes through their Blacktivist social media accounts and co-opting WikiLeaks in exploiting the leak of sensitive documents from the Democratic National Committee (DNC) [71]. Memes were also used to amplify conspiracy theories (e.g., QAnon, Pizzagate, and the murder of Seth Rich [132]), Texas secessionism (e.g., “if Brexit, why not #Texit” [50]), and direct attacks (e.g., “crooked Hillary” [46]).

While the initial campaigns of the troll farms have been tracked, exposed, and brought to attention [29,46], the social media discourse never really recovered from the watershed appropriation of Internet activism for the purpose of conducting information operations [111]. Worse, the troll farm brand of political dissent was adopted by populist accounts keen on disseminating misinformation beyond just politics [51]. The trolling pandemonium spilled out of control with the COVID-19 pandemic, as rumors, conspiracy theories, fake news, and out-of-context spins plagued social media by hijacking dominant hashtags like #COVID19, #coronavirus, or #DoctorsSpeakUp [13], co-opting hashtags like #plandemic [60], and counter-hashtagging with hashtags like #COVIDIOT [110]. Memes were distributed in conjunction with deepfake videos on platforms like YouTube [96] and TikTok [8], as well as blatant fake news on alt-platforms like Gab [19], culminating in a self-perpetuating bedlam of misinformation Internet counter-activism.

This paper is available on arXiv under a CC BY 4.0 DEED license.
