Last December, the United Nations warned of an overlooked but critical “emerging terrorist threat”: extremists are radicalizing members of online gaming communities.
Despite widespread interest in protecting gamers from such exploitation, experts say a lack of research funding on the subject has left the game industry behind social networks when it comes to counter-terrorism efforts. That is starting to change, however. Last week, researchers told Ars that the U.S. Department of Homeland Security has, for the first time, awarded funding (nearly $700,000) to a research group working directly with major game companies to develop effective counter-terrorism methods and protect vulnerable gamers.
The new two-year project is run by the Middlebury Institute of International Studies, which houses the Center on Terrorism, Extremism, and Counterterrorism (CTEC). Vice reported that other partners include Take This, a nonprofit focused on the impact of gaming on mental health, and Logically, a tech company that Vice says is “solving the problem of bad online behavior at scale.”
The researchers have summarized their overarching goals for the DHS project as “the development of a set of best practices and centralized resources for monitoring and evaluating extremist activities, as well as a series of training workshops for monitoring, detecting and preventing extremist exploitation in game spaces for community managers, multiplayer designers, lore developers, mechanics designers, and trust and safety professionals.”
Take This research director Rachel Kowert told Ars that the primary goal of the project is to develop resources for the game industry. Her group’s ambitious plan is to reach large companies first and then engage smaller companies and independent developers for maximum impact.
Alex Newhouse, CTEC’s deputy director, told Ars that the project will target major gaming companies that “essentially act as social platforms,” including Roblox, Activision Blizzard and Bungie.
While the project’s funding was only recently approved, Newhouse said CTEC’s work has already begun. The group has been working with Roblox for six months, and Newhouse said it is also in “very preliminary” discussions with the Entertainment Software Association about ways to expand the project.
Borrowing counter-terrorism methods from social media
Newhouse said federal agencies like DHS and the FBI have become increasingly interested in research like CTEC’s to combat domestic terrorism, but to his knowledge no federal organization has funded this kind of data collection before. Although his project is only funded for two years, Newhouse aims to push the game industry to implement, within five years, the same anti-extremism standards that social networking platforms already have.
“I want game developers, especially big ones like Roblox and Microsoft, to have dedicated in-game anti-extremism teams,” Newhouse told Ars. “We now have to push to be just as advanced on the game industry side as well.”
Newhouse plans to draw on his experience helping tech giants like Google and Facebook build counter-terrorism teams. He says CTEC’s top priority is convincing the game industry to invest in proactively moderating extremist content by “implementing increasingly sophisticated proactive detection and moderation systems” like those already used by social networks.
Historically, Newhouse said, gaming companies have relied primarily on players to report extremist content for moderation. That strategy isn’t enough, he said, because radicalization often works by boosting a gamer’s self-esteem, and people manipulated into seeing this kind of online engagement as positive rarely self-report these radicalizing events. By relying strictly on user reports, gaming companies “will not discover anything at the initial recruiting and radicalization level,” he said.
Daniel Kelley, an associate director of the Anti-Defamation League’s Center for Technology and Society, told Ars that online gaming companies are about 10 years behind social media companies in flagging this issue as critical.
Limited funding for anti-extremism efforts in online gaming
Kowert, of Take This, first became interested in the link between online gaming communities and real-world violent extremism after coming across a nationally representative 2019 ADL survey. As it turned out, nearly one in four respondents “were exposed to extremist white supremacist ideology in online games.” Newhouse said that estimate is “probably too conservative” at this point.
Still, ADL said, “evidence of the widespread recruitment or organization of extremists in online game environments (such as in Fortnite or other popular titles) remains anecdotal at best, and more research is needed before broad claims can be made.”
Today, the research base remains limited, but it has become clear that the problem does not only affect adults. When ADL expanded its survey in 2021, it included young gamers aged 13 to 17 for the first time. ADL found that 10 percent of young gamers were “exposed to white supremacist ideologies in the context of online multiplayer games.”
Kowert responded to the 2019 ADL report by pivoting her research and partnering with Newhouse. She told Ars that the reason there is so little research is that there is so little money.
Kelley told Ars that while it’s good to see this research finally getting funded, ADL recommends the government invest far more money to nip the issue in the bud. “Now is not the time to back things up with drop-in-the-bucket funds,” Kelley said. “There’s a lot more the Justice Department needs to do to fund this kind of effort.”
Gaming industry remains unaware
Kowert told Ars that gaming companies have remained largely “unaware of the magnitude of the problem” of extremism on their platforms, especially since they consider themselves gaming platforms first and social platforms second. Newhouse agreed.
“It’s very, very clear in our conversations with the video game industry that they’re not fully aware of the burgeoning problem they have on their hands,” Newhouse told Ars.
According to Kelley, it’s not just social media’s counter-terrorism methods that gaming companies need to embrace; the industry could also become safer under regulations like those that push social media companies to publish transparency reports. The only gaming company Kelley has ever seen publish a transparency report is a small studio called Wildlife Studios, which released its first report this year.
“2022 will be the first time we get any kind of transparency reporting from a game company,” Kelley told Ars. “And it’s not from any of the majors. It’s not from EA. It’s not from Riot. It’s not from Blizzard.”
None of the major online gaming companies mentioned here immediately responded to Ars’s request for comment. Kelley said Roblox is the only major gaming company with a public policy on online extremism.
Part of the reason gaming companies overlook the issue, Kowert says, is the significant body of research debunking the idea that video game content directly affects gamers’ susceptibility to extremism.
The American Psychological Association told Ars that its 2020 report finding that video games do not encourage violent behavior remains its most current statement on the subject. But Kowert says focusing discussions on video game content “impedes the conversation.” More attention, she argues, should be paid to how extremists socially engage gamers while they play.
Kelley says CTEC’s research is an important first step toward greater government involvement in the issue, but even bringing the game industry up to social media standards may be a low bar.
“I think there’s still a long way to go for the social media industry before it has really robust transparency,” Kelley said.
ADL advises online gaming companies to go even further than social platforms on transparency. ADL wants gaming companies to audit “in-game extremism and toxicity” and to include those statistics in the Entertainment Software Rating Board’s game rating system.
More transparency is exactly what researchers studying extremism in online gaming communities need, Newhouse said, because research is also limited by what information is publicly available. But gaming companies do not always collaborate enthusiastically with researchers. When Newhouse reaches out to gaming companies, he said, sharing data isn’t their first instinct, and they generally have to be shocked into joining efforts to protect users.
“Honestly, we usually have to scare companies into listening to us,” Newhouse told Ars.