Harassment in online games is more than random trash talk. According to a new study by Spain’s Miguel Hernández University, it targets specific identities. The GamerVictim project surveyed over 1,800 Spanish gamers and found one in five had faced sexual victimization in online games.
Women, LGBTQIA+ players, and older gamers are far more likely to face abuse in digital play spaces. This harassment isn’t just sexual. Up to 30% of players report hate-driven insults based on gender, race, politics, or sexual orientation.
Lead researcher Mario Santisteban calls online toxicity “common and damaging,” with real-world consequences that go far beyond the screen.
WHO FACES THE MOST HARASSMENT? WOMEN, QUEER GAMERS, AND OLDER PLAYERS
The study reveals a clear profile of those most at risk in online gaming spaces, especially in multiplayer and competitive formats.
Women and LGBTQIA+ players are disproportionately targeted with unwanted sexual messages, threats, and gender-based insults. Older gamers—often sidelined in youth-focused gaming narratives—also report high levels of exclusion, mockery, and verbal abuse.
Gaming frequency matters too: those who play more hours per week report higher exposure to harmful behavior. Sharing personal information increases vulnerability. Players who reveal gender, age, or sexuality are more likely to be harassed during matches. Santisteban says this pattern reflects broader social inequalities replicated in digital spaces—where anonymity fuels harm with little accountability.
HATE SPEECH AND SEXUAL ABUSE AREN’T ISOLATED INCIDENTS
The research team analyzed four major harassment types: social violence (insults and hate speech), sexual harassment, economic abuse, and excessive gaming behaviors. These forms of abuse often overlap. For many, a typical game session may include both hate speech and inappropriate sexual comments.
Roughly 20%–30% of players experienced at least one hate-motivated incident during online play, depending on the form of aggression. The study emphasizes that harassment isn’t occasional—it’s normalized and woven into online gaming culture, especially in high-intensity formats.
In fact, players exposed to frequent abuse often internalize it and repeat toxic behaviors toward others in later games. This cycle of abuse contributes to an environment where harmful language and actions go unchecked, and even come to be expected.
MENTAL HEALTH FALLOUT: THE PSYCHOLOGICAL TOLL OF ONLINE ABUSE
The emotional toll of digital harassment is significant and lasting, especially for gamers who play frequently or are already marginalized. Victims report symptoms like anxiety, depression, social withdrawal, and lowered self-esteem tied to experiences in online games.
Many also say they’ve left games they once loved due to constant harassment, choosing to avoid rather than engage. Others mirror the abuse, becoming aggressors themselves—a troubling feedback loop of trauma and retaliation in gaming communities.
This behavioral echo shows how toxic spaces don’t just harm—they transform victims into perpetrators, spreading hostility even further. Harassment becomes not just normalized—but contagious.
LEAGUE OF LEGENDS STUDY SHOWS JUST HOW COMMON TOXICITY HAS BECOME
The GamerVictim team previously studied League of Legends, one of the world’s most-played competitive games, to analyze in-game toxicity. They examined 328 matches and found 70% included some form of toxic behavior—mostly insults, threats, or disruptive chat conduct.
More extreme hate messages were less frequent. However, the study flagged the routine nature of everyday verbal abuse in online games. This normalization makes detection and moderation difficult, as subtle toxicity often flies under the radar of current reporting systems.
Santisteban argues that highly competitive formats like MOBAs (Multiplayer Online Battle Arenas) intensify this dynamic because of their fast pace and pressure. "Crowded digital spaces, poor moderation, and team stress create a perfect storm for verbal violence," he says.
GAMING DESIGN AND TECH GAPS ENABLE ABUSE TO THRIVE
Most online games lack robust safety features, leaving moderation to overburdened community managers or ineffective reporting systems. Current design strategies often fail to prevent harassment before it starts or provide meaningful intervention once it occurs.
Players often report harassment, only to find little or no follow-up from game developers or platform moderators. Santisteban warns that passive moderation emboldens aggressors and alienates victims who feel unheard or unprotected in online spaces.
More proactive design—such as early warnings, automated flagging tools, and community-led oversight—could help curb toxic behavior before it escalates. The study also calls for in-game tools that promote prosocial conduct and reward positive engagement, not just punishment after abuse.
DEVELOPERS MUST STEP UP: PREVENTION STARTS WITH DESIGN
Game developers, the researchers argue, are in the best position to implement solutions and reshape gaming spaces into safer environments. They urge companies to prioritize user protection with adaptive game design, improved reporting systems, and accessible in-game support.
Santisteban emphasizes that toxic behavior is not inevitable. “Design can either suppress abuse or enable it. It’s a matter of priorities.” Suggestions include real-time content filters, AI-powered moderation, and transparent penalties for repeat offenders.
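To make the idea of a real-time content filter with escalating penalties concrete, here is a minimal sketch in Python. This is not the study's proposal or any real game's moderation code: the blocklist, function names, and penalty thresholds are all invented for illustration, and a production system would pair simple keyword matching with machine-learning classifiers and human review.

```python
# Illustrative sketch only. BLOCKLIST, PenaltyTracker, and filter_message
# are hypothetical names, not drawn from the study or any real game's API.
from collections import defaultdict

# Placeholder terms; real filters rely on curated lists or ML classifiers.
BLOCKLIST = {"slur1", "slur2"}

class PenaltyTracker:
    """Tracks flagged messages per player and escalates sanctions."""

    def __init__(self):
        self.strikes = defaultdict(int)

    def record_flag(self, player_id: str) -> str:
        self.strikes[player_id] += 1
        n = self.strikes[player_id]
        if n == 1:
            return "warning"       # early warning on first offense
        elif n <= 3:
            return "chat_mute"     # temporary mute for repeat offenses
        else:
            return "review_queue"  # escalate to human moderators

def filter_message(text: str, player_id: str,
                   tracker: PenaltyTracker) -> tuple[bool, str | None]:
    """Return (allowed, sanction); block the message if it hits the blocklist."""
    tokens = set(text.lower().split())
    if tokens & BLOCKLIST:
        sanction = tracker.record_flag(player_id)
        return False, sanction
    return True, None

# Example: a repeat offender moves from a warning to a mute.
tracker = PenaltyTracker()
print(filter_message("gg well played", "p1", tracker))  # (True, None)
print(filter_message("you slur1", "p1", tracker))       # (False, 'warning')
print(filter_message("slur2 again", "p1", tracker))     # (False, 'chat_mute')
```

The escalation logic is one way to operationalize "transparent penalties for repeat offenders": each sanction is predictable and tied to a visible strike count rather than an opaque moderator decision.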
Inclusive avatars, community codes of conduct, and identity-safe customization could also help underrepresented groups feel seen and respected. Simple interface changes, such as making voice chat optional or hiding identity data, can significantly reduce exposure to harm.
A LEGAL PATH FORWARD: REGULATION MAY HELP CLOSE THE GAP
The study highlights the European Union’s Digital Services Act (DSA) as a possible framework for better governance in online gaming. The DSA requires large platforms to mitigate risk, offer transparency, and report their moderation policies to regulators.
Gaming platforms could fall under the DSA’s scope, giving users more legal backing when reporting abuse or unsafe practices. This regulation could push companies to treat safety as seriously as profit, aligning with broader digital rights goals across the EU.
Santisteban believes legal accountability will be a game-changer: “If developers won’t act voluntarily, then law must make them.”
ENDING THE GAME OF HARM
Harassment in gaming isn't a bug; it's a feature of systems designed without marginalized users in mind. As the GamerVictim project shows, women, LGBTQIA+ players, and older gamers are not only targeted more often, they are also more likely to leave.
The cost of silence is exclusion, trauma, and the erosion of once-thriving communities. Ending toxicity won’t happen overnight—but it starts with acknowledging who’s getting hurt, and redesigning the game for them. When games prioritize people over profits, play can become empowering again—not a battlefield of hate, but a space of connection and creativity.