Extremist Action in Digital Gaming Spaces: The Role of Identity Fusion

Radicalisation, Extremism, and Digital Games

Over the last few years, extremism and radicalisation have moved to the forefront of concern in gaming communities. Specifically, this concern centres on the potential for extremists to leverage the extensive reach of games and gaming cultures for recruitment, propaganda dissemination, signalling, networking, and mobilisation. These concerns have been voiced in the media, as evidenced by the 2021 Wired article ‘How Roblox Became a Playground for Virtual Fascists’, but also by organisations such as the Department of Homeland Security and the UN Office of Counter-Terrorism, both of which have held workshops and produced resources specifically focused on the intersection of extremism and digital gaming spaces.

As it stands today, most of what we know about extremism in games is that extremist actors and content are present in digital gaming and gaming-adjacent spaces. In 2021, the Institute for Strategic Dialogue (ISD) released a report examining the presence of the extreme right, specifically, on gaming-adjacent platforms. It found that Discord, one of the most popular third-party chat platforms for gaming communities, serves as a hub for right-wing socialising and community building. ISD also reported that content promoting white supremacist worldviews could be found with relative ease on Twitch, an online streaming platform largely centred on gaming. Steam, an online platform where one can buy, play, and create games, was also found to house a range of groups created for far-right and Neo-Nazi communities.

Within this landscape, there are also games created specifically to propagate extremist ideology. For example, Hatred, a mass-shooting “genocide crusade” simulation game released for PC in December 2014, remains available to download, play, and share on Steam. Commercial off-the-shelf games can also be modified to reflect more extreme worldviews, for example by making a Hitler-led Nazi Germany a playable nation in a modded version of the Civilization series.

The presence of extremist content in gaming spaces was further explored in 2019 by the Anti-Defamation League (ADL), which reported that one in ten young gamers aged 13–17 had been exposed to white supremacist ideologies in games. In 2021, the comparable figure approached 10% of all game players, although it remains unclear whether this difference across time reflects changes in sampling or actual changes in the landscape. It is also important to note that in this ADL report, players reported exposure to the most explicit forms of white supremacist ideology across PC, console, and mobile platforms. These observations are likely just the tip of the iceberg.

While we know that extremist content, radicalisation, and recruitment are present in games, there is still little understanding of how and why this may be of particular concern for gaming communities. In a new research article, I, alongside my colleagues at the University of Texas, Alexi Martel and William B. Swann, explore one potential mechanism through which extremist ideology may spread through gamers and gaming communities: identity fusion.

Gamer Cultures and Identity Fusion

Identity fusion is a psychological construct referring to a sense of oneness between oneself (i.e. individual identity) and a group (i.e. social identity). Typically, our personal identity (the “I”) and our social identities (the “we”) are separate. But sometimes individual and social identities fuse together when the borders between the two become more porous – this is identity fusion. When fusion happens, an individual becomes more willing to enlist the personal self in service of the group; that is, more willing to make personal sacrifices for group goals.

Traditionally, fusion happens when people feel a deep emotional bond to their social group that forms over time through shared experiences, shared norms, and close friendship bonds. As such, fusion research has typically focused on nationalist and military groups – “Once a Marine, always a Marine” comes to mind. However, the above characteristics are also central to gaming experiences and communities. 

As such, we hypothesised that the distinctive environment of gaming spaces and game players may also foster identity fusion, particularly as many people seek gaming communities for social connection. In fact, a major appeal of gaming communities is their capacity to offer a sense of closeness, belonging, and security for individuals who need it the most. However, gaming cultures can also be spaces in which hateful, harassing, and toxic behaviours and language are commonplace. This includes racism, misogyny, and extremist ideologies. Gamer communities, therefore, represent “a double-edged sword”; on the one hand, they may provide a sense of social connection and purpose, but on the other hand, gamers may be exposed to hateful speech, social toxicity, and extremist propaganda. In the worst-case scenario, those most entrenched in the community may be lured into embracing extremist beliefs, leading them down the path to radicalisation.

Recent Research

To explore this possibility through the lens of identity fusion, we conducted a series of three studies. The results indicated that fusion with gamer culture does occur, and that a fused gamer identity is associated with a profile of extreme and antisocial behaviour.

Specifically, we found that a fused ‘gamer’ identity was associated with a willingness to fight and die for other gamers, recent aggressive behaviours, Machiavellianism (a personality trait characterised by interpersonal manipulation, deceit, cynicism, and a lack of morality), narcissism, psychopathy (a lack of empathy), sexism, racism, and the endorsement of beliefs and policies centred on white nationalism.

Notably, demographic factors including age, gender, and years spent gaming had no effect on these outcomes, but gameplay time did. The relationships between fusion and the outcomes discussed above were also significantly stronger among those who played primarily multiplayer online games, as compared to single-player offline games. This suggests that the play time measure here captures exposure to gaming cultures rather than to violent or extreme content. That is, the more time you spend immersed in potentially ‘toxic gaming cultures’, the greater the likelihood you will be exposed to extreme ideologies, and the more opportunity you have to internalise those beliefs and endorse extreme behaviours.

These effects were also magnified among players who primarily engage in more socially toxic gaming spaces (e.g., Call of Duty) rather than less socially toxic ones (e.g., Minecraft). These two games were chosen because their communities are widely acknowledged to sit at opposite ends of the toxicity spectrum. Call of Duty has been specifically documented as particularly toxic, with its fans rated the “most toxic in the gaming world” based on the ratio of positive to negative language on social media. In contrast, the Minecraft community has a reputation for being helpful, welcoming, and friendly, winning the 2020 Golden Joystick Award for Best Gaming Community. However, additional research is needed to tease apart the impact of game content (i.e., violent, competitive) versus social environment (i.e., more or less social toxicity) on identity fusion and extreme outcomes. Taken together, these findings provide a foundation for better understanding the psychological processes unique to gaming that may bolster radicalisation within digital gaming communities.

We know that digital games, with their social elements, content, and cultures, act as cultural assets of influence and perform a socialising function within the wider context of radicalisation. We also know that extremists are recruiting, fundraising, signalling, radicalising, and disseminating propaganda in these spaces across platforms, regardless of age or gender. With this work, we hope to guide future efforts in this space to consider the role of gamer identity, both to better understand the processes underlying games as cultural assets of influence and to guide the development of proactive solutions.

Rachel Kowert, Ph.D., is a research psychologist, the Research Director of Take This, and a science content creator on the YouTube channel Psychgeist. She is a world-renowned researcher on the uses and effects of digital games, and in her current work she serves as one of the primary investigators on the first grant-funded project from the Department of Homeland Security on games and extremism.

Twitter: @DrKowert

This article was originally published in conjunction with our partner, GNET, as an Insight on 21 November 2022.

More EGRN Insights

More than Sports: Building Resilience against Extremism in Esports

In a world where young people spend most of their time online, gamers are at risk of encountering radicalizing content more often than others. The United States Esports Association has taken up the challenge of addressing these issues through competency-building initiatives aimed at combatting violent extremism. Read on to find out more about the potential risks of esports and the Association’s innovative solutions.

Preventing Extremist Violence Using Existing Content Moderation Tools

Accurate content moderation can save lives by acting as an early warning system about the risks of offline extremist violence and removing the fuel that incites it. With the explosion in automated content moderation approaches, any number of widely accessible automated detection tools can be used on known violent extremist user-generated content to improve a platform’s detection methods through fine-tuning and customization. Multiple AI-based approaches to detection can identify users, conversations, and communities, signaling a high likelihood of extremist violence. With better real-time detection, platforms can be empowered to break up harmful and criminal communities, helping to damage online influence processes that lead to extremist violence.