Communities are essential in combating extremism: they shape the communication culture in a game and play a key role in curbing problematic behaviour.
Communities are an integral part of every game today. Exchanges in digital and physical channels are part of today’s fan and pop cultures. In addition to the actual gaming experience, they are an expression of cohesion and interest in a common theme: a shared passion for a particular game, or for gaming culture in general, holds them together.
Healthy communication culture, strong community
A community consists of different actors. This diversity is reflected in how a healthy communication culture is cultivated. Ideally, communication in the game community is respectful, age-appropriate, and promotes fair play.
Both bottom-up and top-down initiatives characterise the communication culture in the community. Two goals are at the forefront: first, to create an environment in which newcomers and experienced gamers alike have positive experiences inside and outside the game; second, to guide game developers and gamers so that players are only exposed to age-appropriate content.
Game ratings such as PEGI (Pan European Game Information) are a good example of promoting positive community communication based on a self-declaration system. The PEGI categorisation helps to determine the suitability of a game, especially for children and young people (Game age ratings explained). This is important because when children “engage with things they are not emotionally ready for, it can have long-term effects on their well-being” (Dr Linda Papadopoulos).
In-game communication also deserves attention. People with dishonest intentions can use a multiplayer game with a low age rating to contact minors or vulnerable people, and it is important to know that online interactions with other gamers are not covered by these age classifications. Committing to a healthy communication culture at all levels is therefore crucial in countering the growing toxicity problem.
GG! Good Game! Do’s & Don’ts – unwritten rules
Unwritten rules of conduct and communication contribute to a healthy communication culture. These are based on gamers’ own experiences and contribute to respectful interaction among gamers, or, in gaming vocabulary: GG! Good Game!
Interestingly, despite the variety of games available, these do’s & don’ts all have a common foundation. They help to create an inclusive environment by emphasising patience and understanding for newcomers. At the same time, aggressive language, cheating and leaving in the middle of a game are explicitly mentioned as actions to be avoided and are considered unfair.
Here are some general rules for gamers of all ages:
Do’s | Don’ts |
Play by the rules | Cheat |
Give new players a chance to learn the ropes | Set yourself above newcomers (Noobs) |
Learn how to mute other players or yourself | Send spam in chat |
Highlight the winners of the game (GG) | Insult the winners after losing the game |
Toxicity as a problem
Most community members and gamers behave correctly and do not pose a problem. Unfortunately, however, some individuals or groups stand out negatively and poison the gaming experience of entire communities.
Despite attempts to foster a positive and respectful communication environment, online gaming communities are not immune to social problems. Hate speech and extremism, for example, are common issues across different platforms. Additionally, the language used in games is constantly evolving. This evolution can sometimes soften or remove the source of harm, or it can signal that the behaviour has become part of the culture: the phrase “Kill yourself!”, for example, has evolved into the abbreviation “KYS”.
Toxicity can take many different forms, such as ‘name-calling, racism, stalking, hate speech, insults, explicit language, flaming (hostile online interactions involving an exchange of abusive messages or flames between users), physical threats and doxing (disclosure of sensitive information about a person)’ (Ewalt Blair). Racism and sexism in particular are used to spread toxicity and contaminate gaming communities (Ghosh).
Disruptive or destructive behaviour can go unchecked if no resources are available to deal with such situations. Bystanders can become unintentional accomplices when they are left without options or feel pressured not to intervene. When such behaviour is normalised, gamers are less likely to stand up to abusers for fear of being targeted themselves.
What can cause toxicity?
Communication discord
The newness of online environments and the anonymity they offer have meant that video games lack the established social norms that face-to-face interactions have developed over time. This absence of norms can undermine self-regulation among gamers and cause confusion and conflict.
The absence of nonverbal cues in online gaming exacerbates these challenges. Additionally, when players do not have to deal with the social repercussions of their actions, they are more likely to engage in inappropriate behaviour. The anonymity provided by online spaces can also lead gamers to assume that everyone in their group shares the same views and values, which can further decrease empathy and devalue individuality.
Online games are not typically created to facilitate ongoing interactions between players, making it difficult to establish relationships and trust with other gamers. Instead, players often engage in battles and competitions where one mistake can harm someone they’ve just met. Interactions tend to become hostile quickly and without reason, leaving little opportunity to build trust and benefit from positive exchanges.
Looking closer at undesirable behaviour
Undesirable behaviour can have multiple roots, such as a lack of understanding of cultural differences, immaturity, the desire to provoke others, conforming to community norms, retaliation, or taking justice into one’s own hands. In addition, how players behave in a game is influenced by their understanding of what is expected of them and their fellow players.
Below are just some effects of such behaviour towards gamers:
- Gamers may feel harassed by other players who can access their game statistics or other personal information. In addition, third-party websites collect data from API interfaces and share information about gamers (see the sketch after this list).
- Revealing the specifics of behavioural software, such as punishment systems, can encourage players to push the software’s limits, which can lead to it being used as a tool to harass others or to the development of automated systems for large-scale attacks.
- Less experienced or new players may be blamed more, as their actions are more visible, or they are assumed to make more mistakes.
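To illustrate how exposed such data can be, below is a minimal sketch of pulling a player’s public match history from a hypothetical third-party statistics site; the host `stats.example.com`, the endpoint, and the field names are invented for illustration, not taken from any real service.

```python
import json
import urllib.request

# Hypothetical third-party stats API; real aggregator sites expose similar endpoints.
STATS_API = "https://stats.example.com/api/v1/players/{player_id}/matches"

def fetch_match_history(player_id: str) -> list[dict]:
    """Fetch a player's public match history -- no login or consent required."""
    with urllib.request.urlopen(STATS_API.format(player_id=player_id)) as resp:
        return json.load(resp)

# Anyone who knows (or guesses) a player ID can see when and what they played,
# which can be enough to single a player out for targeted harassment.
history = fetch_match_history("player-123")
for match in history:
    print(match.get("played_at"), match.get("map"), match.get("score"))
```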
Does the dose make the poison? What we can do against toxicity
Games, like other media, can be used as a platform to spread hateful content and extremist propaganda. Game developers and publishers are therefore taking active steps against this: they want to ensure that community members are protected from toxicity and are not harassed during the gaming experience. Toxicity that does emerge harms the game and can even damage the reputation of the game developers. In the worst case, gamers turn away from the game, causing it to lose popularity in the community.
In most cases, a game’s community gives no cause for complaint, so game developers and publishers may assume they are safe and fail to prepare a decisive response for an emergency. At the same time, it is important not to intervene hastily in gamers’ communication and restrict freedom of expression.
However, the commitment and effort required to counter emerging toxicity should not be underestimated. The gradual transition from healthy to negative communication makes it difficult to act decisively on the toxicity problem, and the vast amount of text, images and videos created every day means that monitoring content and, where necessary, deleting it requires considerable resources. Without this effort, there is a risk of economic damage and of harm to the game developer’s reputation.
Countering hate speech and extremism
There is no one-size-fits-all solution against hate speech and extremism. Instead, the problem must be tackled at different levels and by combining several instruments. Below are some approaches in more detail.
Legislation
The first level of combating hate speech and extremism is state laws and regulations. They set the rules on which content is prohibited and which is not. However, it is essential to note that games are usually played in numerous countries with different political systems. In certain countries, for example, laws restrict freedom of speech in order to oppress citizens. As a result, content may be allowed in one country but considered hateful in another. These legal differences make this approach challenging to implement.
Guidelines
Game developers and publishers should be aware of the scope of the problem as early as the development phase. The same applies to forums, chat rooms and moderation of content/communities. Guidelines for the community are an excellent way to start addressing the issue.
A Code of Conduct can establish clear guidelines and expectations for gamers and game developer staff and provide a framework for consistent enforcement. It should be easily accessible and clearly presented, using simple language and illustrative examples rather than an exhaustive list of rules. This helps gamers understand the expectations and shows them how to adjust their behaviour if they make a mistake. It should also capture the spirit of the rules rather than just listing specific dos and don’ts.
In-game resources
It is important to provide gamers with the resources needed to actively report instances of toxicity in-game. The ease of access to reporting features, the variety of reporting categories, and the language used in the reporting process all play a role in shaping gamers’ expectations. Showing gamers that their reports are valued and meaningful can help to increase accountability. Providing specific, actionable feedback in the form of warnings or penalties can also make it clear to gamers what their responsibilities are, reducing the number of appeals and decreasing the rate of repeat offenders.
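As a rough illustration of such a reporting flow, here is a minimal sketch in Python; the categories, the `Report` structure, and the feedback messages are invented examples, not the design of any particular game.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ReportCategory(Enum):
    HATE_SPEECH = "hate speech"
    HARASSMENT = "harassment"
    CHEATING = "cheating"
    SPAM = "spam"

@dataclass
class Report:
    reporter_id: str
    offender_id: str
    category: ReportCategory
    comment: str = ""
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def submit_report(report: Report, queue: list[Report]) -> str:
    """Acknowledge the report immediately, so reporters see it is valued
    rather than disappearing silently."""
    queue.append(report)
    return f"Thanks -- your {report.category.value} report is being reviewed."

def penalty_notice(category: ReportCategory) -> str:
    """Specific, actionable feedback: name the behaviour, not just 'a violation'."""
    return (f"A warning was applied to your account for {category.value}. "
            "Repeated offences lead to suspensions.")

queue: list[Report] = []
print(submit_report(Report("alice", "bob", ReportCategory.HATE_SPEECH), queue))
print(penalty_notice(ReportCategory.HATE_SPEECH))
```

Distinct categories matter here: they set expectations for reporters and let the penalty notice name the exact behaviour being sanctioned.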
Content moderation
Content moderation is a way to identify and remove hateful and extremist content. It is mainly used on social media and forums and, to a lesser extent, in games. One example is the use of word filters, where specific words or phrases are flagged for immediate removal. At the same time, blocking such words can also limit how gamers from a particular marginalised group identify with one another and place a further burden on these groups.
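A word filter of this kind can be surprisingly simple. The following minimal sketch (the blocklist is an invented placeholder) shows the basic mechanism; its comments point to the over-blocking risk described above.

```python
import re

# Placeholder blocklist -- real lists are far larger and maintained continuously.
BLOCKED = {"kys", "noob trash"}

# One regex with word boundaries, so "kys" does not match inside a word like "skys".
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(term) for term in BLOCKED) + r")\b",
    re.IGNORECASE,
)

def filter_message(text: str) -> str:
    """Mask blocked terms with asterisks before the message reaches chat."""
    return PATTERN.sub(lambda m: "*" * len(m.group()), text)

print(filter_message("gg, well played"))      # unchanged
print(filter_message("KYS, you noob trash"))  # blocked terms masked
# The same mechanism silences any term added to the list -- including reclaimed
# or identity terms -- which is how filters can burden marginalised groups.
```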
A further disadvantage is the difficulty of recognising nuance in communication: some behaviour is hard to categorise as clearly problematic or acceptable. The lines blur between activism and harassment, between friendly banter and offensive mockery, and between banter among strangers and among friends.
Education
How individuals are introduced to digital environments, and the behaviour modelled by caregivers, can significantly affect their ability to navigate these environments and interact appropriately and respectfully with others. Young people must be taught how to be respectful and empathetic towards others, handle frustration, and understand the importance of teamwork in online games. It is also essential to provide a safe space for them to discuss their online experiences. The norms and expectations of specific online communities and genres influence what counts as acceptable behaviour. It is therefore necessary to promote diversity in online environments while allowing for self-expression and a sense of community.
Standing up against toxicity

- Detect offensive content, flag it, and identify user sources before disciplinary action is taken (see the sketch after this list).
- Implement systems that reinforce polite behaviour and set community standards for how gamers interact with each other during the gaming experience.
- Increase transparency between game developers and gamers.
- Regularly revise reporting systems to more accurately identify and, if necessary, punish behaviour that gamers consider toxic.
- Encourage parents to talk to their children about the issue early and regularly, so that they can recognise hate speech and know what to do if an incident occurs.
- Where appropriate, choose a tool to filter swear words and moderate content with child-friendly game settings.
- Reward positive and constructive behaviours.
- Collect data and track trends regarding harmful content and user groups.
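To make the first and last points concrete, here is a minimal sketch (the user IDs, categories, and review threshold are invented) of flagging content, escalating repeat offenders for human review rather than punishing immediately, and collecting trend data.

```python
from collections import Counter, defaultdict

flag_counts: Counter[str] = Counter()                       # flags per user
category_trends: defaultdict[str, int] = defaultdict(int)   # flags per category

def flag_message(user_id: str, category: str) -> None:
    """Record a flag for later review instead of punishing on the spot."""
    flag_counts[user_id] += 1
    category_trends[category] += 1

def needs_review(user_id: str, threshold: int = 3) -> bool:
    """Escalate to a human moderator only after repeated flags,
    so a single misfire does not trigger disciplinary action."""
    return flag_counts[user_id] >= threshold

for cat in ("hate speech", "hate speech", "spam"):
    flag_message("bob", cat)

print(needs_review("bob"))     # True -- three flags, escalate for human review
print(dict(category_trends))   # trend data: {'hate speech': 2, 'spam': 1}
```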