Published in IEEE Computer, March 2026.



FROM TROLLS TO EDGE LORDS: The Use of Social Media to Support Extreme Ideology

Hal Berghel

ABSTRACT: Online trolling is familiar to all networked computer users. Over the past few years it has evolved into a more sinister form: edgelording.

That social media platforms have been used to foment discord, hate speech, false narratives, etc. is beyond dispute at this point, and has been recognized by traditional media outlets such as Time Magazine [1], the academic community [2], and the American Bar Association [3]. The ABA report is noteworthy for its condensed overview of key aspects of such social media abuse:

ABA KEY FEATURES OF EXTREMIST USE OF SOCIAL MEDIA [3]

Targets:

  1. People of color
  2. Immigrants
  3. Jewish people
  4. LGBTQ people
  5. Religious institutions
  6. Government buildings
  7. Abortion clinics

Nature of Messaging:

  1. Publishing their plans 
  2. Engaging social media users in online conversation
  3. Using messaging that attracts a young audience
  4. Showing acts of violence
  5. Taking responsibility for terrorist acts
  6. Redirecting social media users to websites about their groups
  7. Seeking financing

Outcomes:

  1. Mistrust of institutions
  2. Rejection of facts
  3. Erosion of traditional norms
  4. Dysfunction among lawmakers
  5. Increase in violence
  6. Risk of cybercrime

Even the U.S. Department of Justice and the U.S. Government Accountability Office have weighed in on the complex interrelationship between social media and online extremism. [4][5] One interesting observation made by the GAO was that data from the FBI's Uniform Crime Reporting Program revealed that while bias motivations and offender race were consistent between online and offline crimes, the nature of the offenses diverged sharply. For example, where intimidation offenses represented 25% of non-Internet-related crimes, they represented 89% of Internet-related crimes. So it appears clear that Internet resources (including social media) are the preferred venue for intimidation, bullying, hate speech, and the like. This is understandable given the ease of access and relative anonymity provided by online platforms compared with tagging, defacement, vandalism, and other less subtle outlets.

A useful survey of recent studies on the detection of extremism and ideological orientation is provided by Ravi and Yuan [6], and two recent studies that address the question of why people engage in violence or join violent groups are provided by Gomez, et al. [7] and Hafez [8].

RATIONALES FOR EXTREMISM

Several factors contribute to the current wave of social media-inspired extremism. Central to the mix is politics driven by emotions like anger [9], resentment [10], and hatred. [11] William Galston refers to these emotions, along with humiliation, fear, and the drive for domination, as ‘dark passions.' [12] As Steven Webster has observed, since such emotions lead to voter loyalty, they are natural bedfellows of authoritarians, demagogues, and tyrants. [9] Further, when it comes to politics, there is an observed tendency for people to find negative emotions far more compelling motivators than the ideologies themselves. To paraphrase Galston, while dark forces in politics like extreme partisanship, anger, resentment, hatred, and bigotry are not eliminable, maintaining a fully functional democracy requires, at a very minimum, a public understanding of their effects. [12] Any effective post-truth-era Realpolitik will have to recognize the influence of such tribal realities.

Social scientists have also come to understand more about the pathways that lead to violent extremism. These include such motivational drivers as fairness and justice (as extremists interpret them), the desire to solve pressing social problems, and sundry strains of fanaticism (religious, ethnic, racist, economic, etc.). [13] But since the ‘universal values' of extremism tend toward the shallow, immature and myopic, they tend to reinforce divisiveness rather than inclusiveness.

Studies that focus on other psychological drivers of extremism, such as need, narrative, and networking, may also be found in the literature. [7] Notably, the need to feel valued and respected is a strong motivator, as is the desire to abandon a dissatisfying life narrative and adopt a new one with shared values and camaraderie – even if the new narrative legitimizes violence and anti-social behavior. There is no shortage of social science research on the pathways and motivations that lead to the support of, or participation in, extremist behavior. Just one case, the January 6, 2021 attack on the US Capitol, led to over 1,500 federal prosecutions, of which 1,126 resulted in sentences and 64 in prison time. [14] In any of its myriad manifestations, the motivations behind extremism have their roots in self-aggrandizement or, in the words of Saint Augustine, libido dominandi (the lust for domination).

New to the 21st century are two technical innovations: social media and generative AI. These technologies combine to reduce the friction on the pathways to extremism and significantly amplify the attendant social distress. We frame the remaining discussion around two contributors: trolls and edgelords.

TROLLS AND EDGELORDS

In Scandinavian mythology, trolls are mischievous and sometimes dangerous creatures that live underground. In cyber mythology, trolls disrupt normal, customary, and polite information exchange in order to provoke a response. These two mythologies have closely connected objectives: interference with normal human interaction.

In its simplest form, the phrase ‘online trolling' denotes the practice of anonymous interruption of normal and customary information exchange in order to lure the recipient into reacting to a message. [15] This activity can range from simple interference to a psychopathological expression of personality disorders on the part of the troll (e.g., online bullying, sadism, schadenfreude, sociopathy, hate following, acting as agent provocateur, etc.). [16][17][18][19] In our view, any discussion of trolling that fails to take trolling pathology into account is necessarily incomplete. The dark side of trolling is a significant factor in the Internet's negative space and must be recognized as such. [20]

Trolling is a many-faceted social phenomenon. The term can describe a range of behavior, from that done ‘for the lulz' to dangerous speech [21] and pathological, anti-social conduct. Edgelords, on the other hand, occupy a special space in the broader trolling expanse. While edgelord behavior may also qualify as inflammatory and distracting, its purpose is far more extreme. Edgelords are more aggressive in their advocacy of extreme, anti-social views. Their goal is to shock and offend rather than embarrass or ridicule. Whereas trolls might seek to make fun of ‘normies', edgelords attack the very essence of normie-dom. The Cambridge dictionary offers this definition:

“edgelord: someone who intentionally expresses opinions that are likely to shock or offend people, especially on the internet, as a way of making others notice or admire them.” (https://dictionary.cambridge.org/us/dictionary/english/edgelord)

Rosenblat and Barnes elaborate:

“In the online realm, where speakers are often anonymous and audiences diffuse, contextual knowledge of threat actors, their rhetorical tendencies, and usage patterns, is key to discerning intent and estimating likelihood of harm. This is particularly challenging in closed platforms, where “in-group” language often leans on irony and memes, creating ambiguity about whether a post is an actionable command or an example of “edgelording”—deliberately using controversial, shocking, or taboo language to garner digital attention.” [22]

As such, edgelords pick up where pathological trolls leave off and are much more likely to fall under the rubric of ‘threat actor' than nudnik. While we admit that there is no clear dichotomy between pathological trolls and edgelords, we emphasize that the absence of dichotomy does not imply a lack of distinction. The degree of anti-social behavior associated with edgelording, and its potential to encourage violence, are both significant and noteworthy. There is no shortage of examples of such aggressive, anti-social edgelord behavior. (cf., https://www.reddit.com/r/explainlikeimfive/comments/5lr2tu/eli5_what_exactly_is_an_edgelord/ )

Particularly alarming is that edgelording is increasing at a time of growing public antipathy to the regulation of false, anti-social, or hostile online content, even when that content seems to encourage violence. [23] Absent dissenting social pressure, this lack of regulation amounts to enabling behavior. While fomenting hatred and violence is nothing new, the effectiveness of edgelording clearly exacerbates the problem. One may get a sense of the difference between extremist messaging before and after the advent of social media by comparing reports of the pro-Nazi Columbians in 1947 [24] with the Groypers in 2025. [25] Social media in no small way amplifies the effects of the messaging, as it is an ideal ecosystem for galvanizing social outliers. [26]

The Democracy-taskforce.org website summarizes the relationship between threat actors and social media succinctly:

Domestic and foreign terrorists and other “threat actors” routinely use social media to publish their plans, engage users in conversation, attract young and impressionable users, draw users to their websites, seek financing, publicize violent acts, and take responsibility for terror attacks. For example, social media was used to coordinate and publicize the January 6, 2021 attack on the U.S. Capitol, the 2021 Patriot Front march, the 2018 Pittsburgh synagogue shooting, several anti-police protests in 2020, and the 2019 New Zealand mosque attack, among other incidents. [3]

Trolling and edgelording are symptoms of the detachment of the public information infrastructure from reality, driven in no small part by computing and network technology, and in particular by social media and generative AI.

REMEDIATION

Global democracy is well past the point of ‘peak truth,' approaching the tipping point at which the detachment of the public information infrastructure from reality is so complete that people become information-averse and more sympathetic to authoritarianism. Alexei Yurchak calls this phenomenon hypernormalization. [27] Deficient epistemological grounding, Yurchak argues, creates a level of confusion and uncertainty that inevitably leads to an embrace of authoritarianism: if no information sources are reliable, the advice of the controlling elite is as good as anything else, so take it. Recent studies have shown that 75% of the world's population now lives under some form of authoritarianism. [28][29][30] This mission slip from electoral democracy to electoral authoritarianism (what political scientists call illiberal democracy or competitive authoritarianism) is both noteworthy and alarming [31][32] and figured prominently in the 2021 Nobel Peace Prize. (https://www.nobelprize.org/prizes/peace/2021/summary/)

Given the close interplay between social media platforms and extremist groups, and the resulting acceleration of hate-speech and violence, it is incumbent on society to seek some sort of remediation. The proposals suggested by the ABA referenced above provide a useful starting point. [3]

  1. Improve coordination between law enforcement, social media platforms, and individual citizens to reduce political extremism online.
  2. Scrutinize niche social media platforms, where extremism can be more common.
  3. Promote media literacy and critical thinking education.
  4. Maintain the new Homeland Security branch dedicated to domestic terrorism and support its social media tracking.
  5. Protect social media users' rights. Focus oversight on curbing violence and dangerous activities without repressing mere political dissent.
  6. Support news distribution and consumption outside of social media platforms.

While all are worthy of consideration, I suggest that only recommendation 3 has any chance of long-term success. Law enforcement and government agencies have historically had little immediate effect in changing social awareness and influencing social norms. Further, high-tech corporations derive enormous revenue from social media platforms and have proven unwilling to do anything that might interfere with that success. To paraphrase Upton Sinclair, it is difficult to get a man to understand something when his salary, power, and influence depend upon his not understanding it. The same principle applies to high-tech corporations. I suggest that effective disincentives to the continued use of social media to promote anti-social and extremist behavior are unlikely to be produced by governments and corporations: governments are singularly ineffective in producing enduring social change in the short term, and corporations have no economic or political reason to do so.

That leaves us with media literacy and critical thinking, which in turn lead us to the twin notions of a diversified, well-rounded education and acceptance of the principle that general education, including critical thinking, is a public good – precisely the educational principles that are currently under attack. I am completely confident in predicting that a continuation of the current trend toward educational models based on indoctrination, and the mission slip in higher education toward specialty occupations and job training, will prove wrong-headed. The message above the chalkboard in K-12 classrooms should include: “Don't feed the trolls, and ignore the edgelords,” and this message should be reinforced in classroom discussions. Until that message takes hold, trolling and edgelording will remain pervasive.

REFERENCES (online links verified 12/10/25)

[1] B. Hoffman and J. Ware, Online Extremism is Decades in the Making, Time Magazine, February 5, 2024. (online: https://time.com/6551865/extremist-social-media/ )

[2] A. Shaw, Social Media, extremism, and radicalization, Science Advances, 9, pp. 1-2, 30 Aug 2023. (online: https://www.science.org/doi/epdf/10.1126/sciadv.adk2031 )

[3] Social Media and Political Extremism, a report prepared for the American Bar Association Task Force for American Democracy by Virginia Commonwealth University, February 28, 2023. (online: https://www.americanbar.org/groups/public_interest/election_law/american-democracy/our-work/democracy-database/social-media-political-extremism/ )

[4] Five Things About the Role of the Internet and Social Media in Domestic Radicalization, NCJ 307323, National Institute of Justice Report, December, 2023. (online: https://nij.ojp.gov/topics/articles/five-things-about-role-internet-and-social-media-domestic-radicalization )

[5] Online Extremism: More Complete Information Needed about Hate Crimes that Occur on the Internet, US GAO Report to Congressional Requests, GAO-24-105553, January, 2024. (online: https://www.gao.gov/assets/gao-24-105553.pdf )

[6] K. Ravi and J-S Yuan, Ideological orientation and extremism detection in online social networking sites: A systematic review, Intelligent Systems with Applications, 24, (2024). (online: https://www.sciencedirect.com/science/article/pii/S2667305324001303?via%3Dihub )

[7] A. Gomez, M. Martinez, F. Martel, L. Rodriguez, A. Vazquez, J. Chinchilla, B. Paredes, M. Hettiarachchi, N. Hamid and W. Swann, Why People Enter and Embrace Violent Groups, Frontiers in Psychology, 11:614657, 2021. (online: https://pmc.ncbi.nlm.nih.gov/articles/PMC7817893/ )

[8] M. Hafez and C. Mullins, The Radicalization Puzzle: A Theoretical Synthesis of Empirical Approaches to Homegrown Extremism, Studies in Conflict & Terrorism, 38:11, pp. 958–975, 2015. (online: https://www.tandfonline.com/doi/full/10.1080/1057610X.2015.1051375#d1e144 )

[9] S. Webster, American Rage: How Anger Shapes Our Politics, Cambridge University Press, Cambridge, 2020.

[10] K. Cramer, The Politics of Resentment: Rural Consciousness in Wisconsin and the Rise of Scott Walker, U. Chicago Press, Chicago, 2016.

[11] S. Kaufman, Modern Hatreds: The Symbolic Politics of Ethnic War, Cornell University Press, Ithaca, 2001.

[12] W. Galston, Anger, Fear, Domination: Dark Passions and the Power of Political Speech, Yale University Press, New Haven, 2025.

[13] L. Grigoryan, V. Ponisovskiy, S. Schwartz, Motivations for violent extremism: Evidence from lone offenders' manifestos, Journal of the Society for the Psychological Study of Social Issues, 79:4, pp. 1440-1455, 2023. (online: https://doi.org/10.1111/josi.12593 )

[14] The Jan. 6 attack: The cases behind the biggest criminal investigation in U.S. history, online report, NPR Investigation, updated March 14, 2025. (online: https://www.npr.org/2021/02/09/965472049/the-capitol-siege-the-arrested-and-their-stories )

[15] H. Berghel, Trolling Pathologies, Computer, 51:3, pp. 66-69, 2018. (online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8320219 )

[16] N. Anderson, Science confirms: Online trolls are horrible people (also, sadists!), Ars Technica, 2/20/2014. (online: http://arstechnica.com/science/2014/02/science-confirms-online-trolls-are-horrible-people-also-sadists/ )

[17] P. Brubaker, D. Montez, S. Church, The Power of Schadenfreude: Predicting Behaviors and Perceptions of Trolling Among Reddit Users, Social Media + Society, 7:2, 2021. (online: https://journals.sagepub.com/doi/epub/10.1177/20563051211021382 )

[18] J. Ouwerkerk and B. Johnson, Motives for Online Friending and Following: The Dark Side of Social Network Site Connections, Social Media + Society, 2:3, 2016. ( https://journals.sagepub.com/doi/10.1177/2056305116664219 )

[19] K. Bond, Why do we ‘hate-follow' people on social media?, Independent, 30 April 2021. (online: https://www.the-independent.com/life-style/the-psychology-behind-why-we-hatefollow-people-on-social-media-b1837751.html )

[20] G. Fuller, C. McCrea and J. Wilson, Trolls and The Negative Space of The Internet, The Fibreculture Journal, v. 22, Open Humanities Press, Sydney, 2013. (online: http://fibreculturejournal.org/wp-content/pdfs/FC22_FullIssue.pdf )

[21] Dangerous Speech: A Practical Guide, Dangerous Speech Project, April 19, 2021. (online: https://www.dangerousspeech.org/libraries/guide )

[22] M. Rosenblat and L. Barnes, Digital Aftershocks: Online Mobilization and Violence in the United States, NYU Stern Center for Business and Human Rights, New York, October, 2025. (online: https://bhr.stern.nyu.edu/wp-content/uploads/2025/10/NYU-CBHR-Digital-Aftershocks_Oct-29-FINAL.pdf )

[23] C. St. Aubin and M. Lipka, Support dips for U.S. government, tech companies restricting false or violent online content, Pew Research Center, April 14, 2025. (online: https://www.pewresearch.org/short-reads/2025/04/14/support-dips-for-us-government-tech-companies-restricting-false-or-violent-online-content/?utm_source=substack&utm_medium=email )

[24] E. Folliard, Columbians cloud Atlanta with an aura of Nazism, Washington Post, Dec. 1, 1946. (online: https://www.pulitzer.org/article/new-and-spectacular-hate-organization )

[25] A. McPhee-Browne, Nick Fuentes is a master of exploiting the current social media opportunities for extremism, The Conversation, November 24, 2025. (online: https://theconversation.com/nick-fuentes-is-a-master-of-exploiting-the-current-social-media-opportunities-for-extremism-269776 )

[26] H. Berghel, Social Media, Cognitive Dysfunction, and Social Disruption, Computer , 57:5, pp. 118-124, 2024. (online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10517741 )

[27] A. Yurchak, Everything Was Forever, Until It Was No More: The Last Soviet Generation, Princeton University Press, Princeton, 2005.

[28] A. Nehme, Autocracies outnumber democracies for the first time in 20 years, Democracy Without Borders report, 27, Mar 2025. (online: https://www.democracywithoutborders.org/36317/autocracies-outnumber-democracies-for-the-first-time-in-20-years-v-dem/ )

[29] M. Nord, D. Altman, F. Angiolillo, T. Fernandes, A. Good God, and S. Lindberg, Democracy Report 2025: 25 Years of Autocratization – Democracy Trumped?, V-Dem Institute, 2025. (online: https://www.v-dem.net/documents/54/v-dem_dr_2025_lowres_v1.pdf )

[30] Two decades of decline in the global state of democracy, online report, Demo Finland, 9.4.2025. (online: https://demofinland.org/en/two-decades-of-decline-in-the-global-state-of-democracy/ )

[31] S. Levitsky, The New Authoritarianism, The Atlantic, February 10, 2025. (online: https://www.theatlantic.com/ideas/archive/2025/02/trump-competitive-authoritarian/681609/ )

[32] Violence, redistricting, and democratic norms in Trump's America, Bright Line Watch September 2025 survey, Bright Line Watch, 2025. (online: https://brightlinewatch.org/violence-redistricting-and-democratic-norms-in-trumps-america/ )